You are viewing a single comment's thread from:

RE: CPU & GPU Mining from your laptop

in #altcoin • 7 years ago

All of my laptops are HPs, Dells, and Sonys. I have yet to find any laptop that can mine 18+ hours a day, every day, without overheating issues. Even if you think your computer is not having heating issues, the lifespan of your processors and GPUs will show otherwise.


I've had three Dells (all older Core 2 Duos), a Core 2 Duo MacBook Pro, a newer i7 MacBook Pro, an older Acer, a much older Dell (talking Pentium III here), an HP (AMD based), a Lenovo, various G4 and G3 based PowerBooks, and a couple I am probably forgetting. All of them have run long stretches of 24 hours/day, 7 days/week, and only two have had overheating issues. One was one of the Core 2 Duo based Dells that had known problems with heat cycles and solder related to the Nvidia GPU, even for people who didn't push them hard. The other was the i7 based MacBook, which functions fine, you just can't use the GPU without the CPU throttling immediately. Some of them have done mining, but mostly I have used them to run various BOINC projects, which push the CPU and GPU at least as hard as mining does.

Most of them, with the two exceptions I mentioned, handled running at 100% CPU utilization and 100% GPU utilization for days at a time without issue.

The newest laptop I currently use (the i7 based MacBook Pro) is about four years old. The next laptop I buy will be a gaming laptop (perhaps an Alienware but I haven't decided and won't buy one until I really need one) because I know it will have decent cooling. Laptops having heat issues really seems to be a (relatively) recent phenomenon as manufacturers have pushed thinness and lightness a little too far. Otherwise, I've found that as long as vents are not obstructed, the fans are working as they should and they aren't in a particularly harsh environment, laptops don't have heat issues. If they do, like I said, I consider them defective (others may have different opinions but that's how I view it and I purchase accordingly). A MacBook "Pro" that costs thousands of dollars should not have to throttle the CPU just to run one of its two "mobile" GPUs at 100%.

Other than the defective Dell above (and it was defective, Dell even extended the warranty on that particular model), I have never had a desktop or laptop CPU or GPU die because of heat issues, no matter how hard I've pushed them. I've lost more power supplies and hard drives than I can count, and a few motherboards, but never a CPU or GPU. Even that Dell wasn't the GPU dying, it was crappy solder holding it to the motherboard. I'm sure that the higher heat does shorten the life of these components, but it obviously hasn't been enough to matter if I'm still running them many years after they are obsolete. The desktops I've had I built myself, and I always make sure they have reasonable cooling. With laptops you kind of have to rely on the manufacturer (or take steps like you've mentioned). But if somebody else has built it, as with a laptop, then I expect them to do it right in the first place.

Sorry, I don't mean to rant, this is just a pet peeve of mine :)

Unless you have two of the exact same computer, one mining and one not mining, it's speculation. It's not a matter of the computer overheating (none of my computers actually overheat), it's a matter of shortening the lifespan of the CPU and GPU. If the lifespan when mining means you need a new GPU and CPU every 2 years, whereas the lifespan not mining means you need them every 3, then even though nothing is overheating, you are still shortening the lifespan of the processor. Anything that runs hotter than 60°C for extended periods of time puts its lifespan at risk.

I'm not speculating, I'm relating my experience. I'm saying that I have decades-old computers that were used in that way (BOINC, SETI@home before that, and mining came later of course, but the same kind of GPU and CPU usage either way). I've been doing that sort of stuff since 1997 at least. I have yet to kill a GPU or CPU. Not a single one. There's nothing to compare unless one actually dies. How many GPUs and CPUs have you managed to kill with heat? And are you sure that's what died? Based on my experience, it is a rare event (never in my case) that heat kills a CPU or GPU unless there is a cooling failure. Even then, the CPU/GPU is usually fine because the computer will either throttle or shut off. Again, I don't doubt that heat can reduce lifespan, but I just don't think it's significant in most cases. The capacitors and/or VRMs on your motherboard, or the power supply, or the hard drive will die long before a CPU or GPU will.

Just some more anecdotal evidence: I have a GeForce GTX 260 Core 216 that ran flat out, either mining or running BOINC projects, 24 hours a day, 7 days a week, for 7 years. I just upgraded that machine last year. Still have the card, though, and there's nothing wrong with it. If shortening the lifespan means that it will only last 10 years instead of 20, then it doesn't really matter. The CPU on this stupid MacBook Pro runs at 97-98°C using only 75% of the cores (which is why it throttles as soon as the GPU kicks in, as I believe the max safe temp is 100°C), but it's still running the same as it did four years ago. It's on 24 hours/day most days. That Pentium III Dell laptop I mentioned was used for VoIP and was on for a year straight (minus the occasional reboot, and this was about a year ago when it was already ancient). In the background BOINC was running and the CPU usage was 100% all the time. That laptop still works fine, though it hasn't seen much use since then.
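For anyone who wants to watch temps like that on their own machine, here's a rough Python sketch using psutil. It's only a sketch under some assumptions: sensors_temperatures() is available on Linux/FreeBSD but not everywhere, and the sensor names and labels vary from machine to machine.

```python
import psutil  # third-party: pip install psutil

# Sample CPU load over a one-second window, then dump whatever temperature
# sensors the OS exposes. On Linux the CPU usually appears under a key like
# "coretemp", but the names and labels differ by hardware.
while True:
    load = psutil.cpu_percent(interval=1.0)  # blocks for the sample window
    for name, entries in psutil.sensors_temperatures().items():
        for entry in entries:
            label = entry.label or name
            print(f"{label}: {entry.current:.0f}°C (CPU load {load:.0f}%)")
```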

I agree that it doesn't hurt to add additional cooling and that it will probably decrease your chances of hardware failure (though I believe those failures aren't likely to be the CPU or GPU). However, I stand by my belief that laptops should be built with cooling adequate to run flat out indefinitely without throttling or failing in an unreasonably short time (i.e., before the hardware is obsolete anyway).

Computers are designed to compute, after all (at least they are supposed to be). As a bit of trivia, before the Pentium generation there wasn't much in the way of power-saving features in CPUs. A 486 ran at 100% all of the time, even if it was executing no-ops, so you didn't save energy or heat even when the CPU was "idle". I believe this changed with the first Pentiums, but perhaps it was later. Of course, a 486 used far less energy than a modern CPU anyway...
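You can still see the old behavior versus the modern one from user space. Here's a minimal, purely illustrative Python sketch: the busy-wait loop pins a core at 100% the way a no-op-spinning 486 would, while the sleep-based wait lets the OS deschedule the process and drop the idle core into a low-power halt state.

```python
import time

def busy_wait(seconds):
    # The core executes instructions continuously, so it sits at 100%
    # utilization and full power draw even though no real work is done,
    # much like a pre-Pentium CPU spinning on no-ops.
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        pass

def idle_wait(seconds):
    # The OS deschedules the process and can halt the idle core (HLT or
    # a deeper sleep state), so utilization and heat drop to near zero.
    time.sleep(seconds)

if __name__ == "__main__":
    busy_wait(5)   # watch a core pin to 100% in your system monitor
    idle_wait(5)   # utilization falls back to idle
```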

I won't continue to argue with you on this after this comment, but every study ever done, along with the laws of thermodynamics, says you are wrong.

Heat increases electrical resistance, which increases wear and tear on the CPU and GPU and causes them to degrade, meaning they will require more voltage over time to run at the same frequency. This is not simply an assumption, it is a fact: a cooling fan will increase the lifespan of a CPU.
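To put rough numbers on how much faster heat ages silicon, reliability engineers often use the Arrhenius model for temperature-accelerated aging. Here's a back-of-the-envelope Python sketch; the 0.7 eV activation energy is a commonly assumed ballpark for silicon wear-out mechanisms, not a figure from the linked thread or for any specific chip.

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV per Kelvin

def acceleration_factor(t_ref_c, t_hot_c, ea_ev=0.7):
    """Arrhenius factor: how much faster a part ages at t_hot_c vs t_ref_c.

    Temperatures are in Celsius; ea_ev (~0.7 eV) is an assumed activation
    energy, so treat the result as an order-of-magnitude estimate.
    """
    t_ref = t_ref_c + 273.15  # convert to Kelvin
    t_hot = t_hot_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_ref - 1.0 / t_hot))

# Example: running at 85°C instead of 60°C ages the part roughly 5-6x
# faster under this model -- significant, yet a 20-year part still lasts years.
print(acceleration_factor(60, 85))
```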

Here is a great post about the subject. http://www.overclockers.com/forums/showthread.php/723980-Truth-about-CPU-degradation

I don't know what it is exactly you think I am wrong about. My experience is my experience, I'm not wrong about it.

That chart has no specific temps associated with it and is using a completely made up timeline (it even says so at the bottom). I'll take my real world experience over a made up timeline.

Like I said, I've been using some chips for 10 years or more, using them hard, and they still work fine. All modern CPUs have cooling fans or they would burn up almost immediately. The point was whether or not laptops have SUFFICIENT cooling; I simply said that if they don't, I consider them defective. Based on my experience and that chart, I must conclude that every laptop I've ever used has sufficient cooling to live a long life without adding additional cooling. Even my MacBook Pro, which has run 24x7 most days at 97+ degrees Celsius for the last 4+ years, hasn't degraded yet (though it's definitely not sufficiently cooled or it wouldn't throttle with GPU usage, and because of the way it is cooled, those cooling pads don't really help anyway). Again, I'm not arguing that heat doesn't matter, I'm just saying notebooks should have sufficient cooling already or they are defective. Having to add extra fans to your laptop just because you are using the CPU for long periods of time is like having to add extra cooling fans to your car's radiator just because you are going on a long trip. I wouldn't buy such a car or such a laptop.

I make sure desktop systems I build have sufficient cooling to actually be used. I expect laptop manufacturers to do the same but maybe you think I'm setting the bar too high?

Yes, heat can and will reduce the lifespan of chips, but the reality is that they will still last many years even running hot, as long as you don't push them past their maximum safe operating temperature. If a chip dies in 10 years instead of 20, then I really don't care.

I'm not disagreeing with you about heat shortening the life of chips. I'm disagreeing with you about whether laptops should be sufficiently cooled as designed (I believe a decent laptop SHOULD be) and perhaps the timeline in which you believe chips will fail when subjected to higher than average temps. The oldest CPU I have in operation is a Pentium II. The oldest laptop CPUs I have are a Pentium III and PowerPC G3. I haven't even managed to kill those yet. Granted, they see little use now but they ran hard for many years and still get fired up occasionally.

Here's a better (though old) article on the subject: http://www.overclockers.com/overclockings-impact-on-cpu-life/

One of the takeaways is that thermal cycles have a far greater impact than just running hot.
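There's a standard model for that effect too: the Coffin-Manson relation, under which cycles-to-failure fall off roughly as a power of the temperature swing per cycle. Here's a hedged Python sketch; the exponent is empirical, and ~2 is just a commonly quoted ballpark for solder joints, so treat the numbers as illustrative.

```python
def relative_cycle_life(delta_t_a, delta_t_b, q=2.0):
    """Coffin-Manson (simplified): cycles to failure scale as delta_T ** -q.

    Returns how many times more thermal cycles part A survives than part B.
    The exponent q is empirical; ~2 is a rough ballpark for solder joints.
    """
    return (delta_t_b / delta_t_a) ** q

# A machine left running flat out 24/7 might only swing ~10°C per cycle,
# while one that idles and loads all day might swing ~40°C. Under this
# model the steady machine's solder survives ~16x more cycles, which fits
# the failure mode of that Dell: solder fatigue from heat cycling, not the
# GPU itself dying from sustained heat.
print(relative_cycle_life(10, 40))
```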
