In article <4730f514@news2.actrix.gen.nz>,
~misfit~ <misfit61nz@yahooligans.co.nz> wrote:
Somewhere on the interweb "Patrick Vervoorn" typed:
I got myself a G0 Q6600, not because it can be overclocked better, but
because I read it ran cooler, and consumed less power.
The G0 stepping only runs cooler and consumes less power (in perhaps 80% of samples) when overclocked. At stock, without BIOS tweaks, it runs to the Intel hard-coded specs, which are the same for all steppings of the Q6600. Same vcore = same heat output. Energy in = energy out.
As far as I understood it, it's apparently below the 'magic'
power-consumption threshold (so that you can put it in cheaper, less
well-cooled cases), but I can't be bothered to look up what exactly that
threshold was. At least that's what I read, with no overclocking in the picture.
I've also seen
some overclocking results, and I don't really think it's worth it.
Aye. OC'ing is not for everyone. However, you have a CPU that could quite
easily run at 3.2GHz, with the only added expense being a better-than-stock
cooler (assuming you have good case ventilation). As you're running BOINC at
100%, you wouldn't experience the issues that are worrying me.
I suppose it could, but I've also never seen the need to overclock my
previous machine (a P4 2.4GHz), and I also see no need to overclock this
one. As I said, perhaps if it runs out of steam, but by that time, either
a major upgrade (like perhaps an FSB1333 umpteen-Core CPU or a GeForce 9
or 10 series GPU) or a totally new system will be the better options,
instead of squeezing out a few more % of performance by overclocking some
vital parts.
For now, it runs what I want to run, as fast as I'd like it to run.
If/when it runs out of steam, I'll see what I can gain by
overclocking its parts.
Sure, fair enough. To be honest I have no need of more power than my CPU
gave at stock speed. My main reason for OC'ing was to be able to do more
work for SETI, while still having power to spare.
While my point is that the system is doing a fine enough job for the cause
of Seti as it is now. ;)
Isn't your system consuming less power when you don't overclock it? How
about 'downclocking' it? Can't you downclock it in such a way that you can
let it run SetiBOINC, at 100%, in the downclocked state, so that the
entire system performs at the level you want it to, and also
consumes the amount of power you want it to?
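The reasoning behind downclocking (and undervolting) is the standard dynamic-power approximation, P ≈ C·V²·f: power falls linearly with frequency but quadratically with voltage. A small sketch, using made-up illustrative values rather than actual Q6600 datasheet figures:

```python
# Illustrative only: dynamic CPU power scales roughly as P = C * V^2 * f.
# The constant C and the voltage/frequency operating points below are
# hypothetical sample values, not Q6600 specs.

def dynamic_power(c, volts, freq_ghz):
    """Approximate dynamic power draw (arbitrary watt-like units)."""
    return c * volts ** 2 * freq_ghz

# Hypothetical operating points: stock vs. a downclocked/undervolted state.
stock = dynamic_power(c=20.0, volts=1.30, freq_ghz=2.4)
downclocked = dynamic_power(c=20.0, volts=1.10, freq_ghz=1.6)

print(f"stock:       {stock:.1f}")
print(f"downclocked: {downclocked:.1f}")
print(f"saving:      {100 * (1 - downclocked / stock):.0f}%")
```

With these sample numbers the downclocked state draws roughly half the power, which is why running BOINC at 100% on a downclocked CPU can still be a net win on the power bill.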
I started a team in my local computer usenet group which, at its height,
had 50+ members. However, on changing to BOINC, a lot of the folks dropped
out.
The NZ team you are a member of? That team racked up a nice amount of
credit. :)
You can find mine too, using
'Patrick Vervoorn', but I have hidden my computers, so there's not
much to see there.
Indeed. You've certainly crunched some units, with a lot of RAC. That's a
fast machine. :-)
Actually, the main crunchers are the Q6600 machine, my 'previous'
Game-Machine (P4-2.4GHz, 1.5GB) and a P4-2.8GHz-HT machine somewhere else.
I had a fixed amount of money reserved for the CPU; for the cost of a
Q6600 I could've also acquired a 3.0GHz E6850. That would've perhaps
served current games better than the Q6600, but I counted on engines to be
able to use more than 2 cores, and I also had SetiBOINC in the back of my
mind, knowing that would like 4 x 2.4GHz cores more than 2 x 3.0GHz.
Seems you joined Seti@home Classic about half a year before I did. ;)
Yeah, within around a month of them starting up. That's how long it took for
me to find out, via my local monthly computer magazine.
I can't really recall how and when I ran into it, but I apparently found
it cool enough to join. ;)
Perhaps they did the best they could using WinXP (Just speculation
from my side, no idea if they could've done it better)?
I have games that, when I exit them and look at the CPU usage graphs for
both cores, show fairly consistent usage of CPU, a smooth line with maybe
+/- 10% of the CPU being used. Baseline varying from game to game. Nothing
like the saw-tooth graph I see from SETI/BOINC.
But perhaps these games are limited/throttled in another way, for instance
by the amount of data they can move towards the GPU, just to give an
example? Or they are idling, waiting for the next frame to be available?
Or waiting for VSync? Those are pretty usable, fine-grained things to
synchronize with (~60-100Hz, depending on your display), so that it can
appear the game just uses 10% of your CPU. But rest assured that when the
code of these games is 'let loose' it will use, during a short burst, 100%
of both these cores.
When I monitor the behaviour during HL2: Ep2, it seems to use just a
single core (on my machine at least). Bioshock seems to limit itself to
using about 50% of all cores.
SetiBOINC isn't limited in that way, there's nothing it has to wait for.
Perhaps if you can write a script or something that can toggle SetiBOINC's
status at 50Hz, you might see something like this...? Of course, the 'toggling
app' should be smart enough not to use all the left-over cycles. ;)
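The "toggling app" idea above can be sketched with BOINC's `boinccmd` CLI and its `--set_run_mode` switch. This is a minimal sketch under stated assumptions: `boinccmd` must be installed and talking to a local client, and the half-second period is my choice; toggling anywhere near 50Hz through an external command isn't practical, so this only approximates a duty cycle at a much coarser grain:

```python
# Hedged sketch: suspend/resume BOINC on a duty cycle so it only crunches
# for a chosen fraction of each period. Assumes boinccmd is on PATH and
# can reach the local BOINC client; period/duty values are illustrative.
import subprocess
import time

def phase_durations(period_s, duty):
    """Split one period into (run, pause) durations for the given duty cycle."""
    run = period_s * duty
    return run, period_s - run

def toggle_boinc(period_s=0.5, duty=0.5, cycles=10):
    """Alternate BOINC between 'always' and 'never' run modes."""
    run_s, pause_s = phase_durations(period_s, duty)
    for _ in range(cycles):
        subprocess.run(["boinccmd", "--set_run_mode", "always"], check=False)
        time.sleep(run_s)
        subprocess.run(["boinccmd", "--set_run_mode", "never"], check=False)
        time.sleep(pause_s)
    # Hand control back to the client's own preferences when done.
    subprocess.run(["boinccmd", "--set_run_mode", "auto"], check=False)

# Usage (not run here): toggle_boinc(period_s=0.5, duty=0.5, cycles=20)
```

Even at this coarse rate, Task Manager's sampled CPU graph would show something closer to the partial-usage line games produce, rather than BOINC's usual flat 100%.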
What does
that same option do on a machine running Linux, for instance?
Other than once a few years back I've not tried Linux. Perhaps it's time I
gave it another go. I keep downloading ISOs...
I think you can download a fairly small 'Live CD' for Linux with which you
can experiment with this. Alternatively, two floppies' worth of Debian
boot code allow you to install the entire OS from the Internet (that's how
I installed a Stable Debian on the P1-133MHz machine). Are you still
using that monthly-limited ADSL subscription, or have things improved Down
Under? ;)
LOL! That's an understatement. I have <looks around> 9 PCs in this room
alone that are ready to run. All over 1GHz CPU, a lot of them Tualatin
Celerons, others AMD Athlons. I must get around to getting rid of some. The
trouble is, nobody wants to pay any money for them, with new, low-end
HP/Compaq/Dell machines being so cheap. Yet they're still excellent internet
appliances, in fact far more powerful than that. I hate to see good working
machines scrapped. (I built most of these from parts that my friends in IT
gave me, parts destined for the scrap-heap.)
Same here. I'm also running SetiBOINC on 2 x P3-700 machines, 1 x
AMD-1.4GHz, 1 x P4 1.7GHz, 1 x P2-400MHz and a P1-133MHz. Beyond that,
it's also running on another AMD ~700MHz machine somewhere else, besides
the main crunchers I mentioned above. All running at ~100%.
I think it's safe to assume Intel considered a single thread OS or
just a single-threaded application running on these CPUs, so I think
you're worrying too much.
That is a trait of mine. Especially when I'm not easily able to replace the
thing about which I'm worrying.
I'm pretty confident Intel thought this over, but if you want confirmation
of this, try it in one of the intel-groups, or perhaps something like
comp.arch (though one should perhaps be very careful treading there :))
Of course, if these extremes are happening because you overclock, I
suppose all assumptions Intel made are out of the door. ;)
No, I considered that and ran it back at stock speed for a while. The thing
behaved the same, albeit at slightly lower temperatures. The fluctuations,
which are my main worry, still occurred at a rate I found disturbing. I
haven't raised the core voltage at all to reach this speed so it's not what
you'd call an extreme overclock by any means.
I do hope you are also aware you could just be observing an artifact of
the temperature sensors in the CPU? I.e. during/after the 'speed bump'
they do not report the correct temperature, and perhaps the temperature
isn't fluctuating as quickly as you think it is? Also something to perhaps
consult the experts about; I can only speculate.
I've got the means to overclock it quite nicely (an nVidia 680i based
mainboard is underneath the CPU), but I have no incentive really.
Same for the graphics card (8800GTX); plenty of options to overclock
it, but why risk it?
Well, you have all the power that you could need, both CPU and graphics
already.
That was the main idea. ;) We'll see how long it holds out. It does do a
nice job of the Crysis demo, although the raw frame rate isn't anything to
write home about (with all settings at High, 1680x1050, I get about 30fps,
though the game does look pretty smooth due to their motion-blurring in
the engine).
Coincidentally, an 8800GTX is sitting on the chair next to me, in its box
with a NZ$920 sticker on it. I'm doing a re-build for a friend this weekend;
his system has to go into a new case as the 8800GTX is a full-length card
and won't fit his existing case. For doing this for him he's giving me his
"old" 7800GT (I get a lot of my hardware this way, in payment for
building/upgrading machines for gaming friends).
The 8800GTX is a mother of a card to be sure. I just about managed to
squeeze it into my case, and when running a heavy application, it blows
a LOT of heat out of the back.
I managed to acquire a 256MB 7600GT when my 128MB 6600GT broke within
warranty, and the factory apparently didn't have a replacement 6600GT to
ship to me, so they shipped a 256MB 7600GT instead. That card is currently
in the P4-2.4GHz machine, which is running Vista as an 'experiment'.
Before I commit anything else to it, I want to make sure it's stable.
Anyway, congrats on the 7800GT; should be a nice card.
Gosh, but I'm good at digression, huh?
Always nice to get some background, my apologies for snipping it out
though. ;)
Regards, Patrick.