Overclocking? Not Any More.
Bill Quick

Intel returns to its roots with slew of overclocker-friendly desktop CPUs | Ars Technica

For the last few years, Intel has focused primarily on its mobile CPUs—its chips for laptops, Ultrabooks, tablets, and phones have generated more attention than its high-end desktop lineup. But yesterday Intel threw a bone to the desktop-building, CPU-overclocking set in the form of a few new high-end chips that will go on sale later this year.

Back in the day, when “laptops” weighed 12 pounds and cost three grand, I built my own desktop machines and overclocked the crap out of everything. Given that it might take an hour to download a one-meg file from the net, you did everything you could to speed up everything else. And even with all that, you spent way more time waiting…and waiting…and waiting…for something simple to load.

I don’t bother anymore. In fact, I don’t even have a desktop anymore. I can see why gamers and a few other high-intensity users would still build and overclock, but I just can’t see this as a significant market.

In fact, nowadays it seems something like building your own microwave or refrigerator. It’s possible, but why bother?


About Bill Quick

I am a small-l libertarian. My primary concern is to increase individual liberty as much as possible in the face of statist efforts to restrict it from both the right and the left. If I had to sum up my beliefs as concisely as possible, I would say, "Stay out of my wallet and my bedroom," "your liberty stops at my nose," and "don't tread on me." I will believe that things are taking a turn for the better in America when married gays are able to, and do, maintain large arsenals of automatic weapons, and tax collectors are, and do, not.


Overclocking? Not Any More. — 12 Comments

    • For example, look at cases. Lian Li makes some beautiful stuff in brushed aluminum that is going to cost you extra–but nothing you’ll be able to get from a prebuilt system from a major vendor will ever look as good.

  1. I find a family desktop with 5 accounts to be an elegant solution for my family. Sure, we’ve got tablets and other machines, but for gaming? The desktop is the best.

    Oh, and as far as getting the exact system I want? I see that as a feature. Also, having built ALL of the PCs in my house, I know how to fix them when something goes wrong. I consider it keeping skills handy for when the SHTF.

    • Note that I did mention gamers as a market for OCing. That said, most of the rest are custom hobbyists like Rick. Nothing wrong with that, but most people aren’t interested. Including, now, me.

      I couldn’t even tell you who makes good mobos these days – and once upon a time I could tell you the characteristics of the top twenty off the top of my head.

      • I guess I’m in the minority, but I enjoy doing it. It’s fun to work on the guts of a PC. Kind of like a silicon surgeon. And I keep thinking that such skills might prove useful when we rebuild after the burning times come and go. Because I no longer think we can avoid them.

        • Toward the end of my overclocking days, PG, I kept on doing it for a while just because I enjoyed it, the same way I used to enjoy completely taking apart the engine on my Triumph Spitfire. There’s something calming about messing around with a power screwdriver and a case full of electronic stuff.

      • No doubt, Bill (and PG). My wife, for example, would probably be just fine with a 10″ Android tablet, say, or the Dell Venue Pro (although she’d want a bigger screen than the 8″ version).

        Gamers, hobbyists, developers. People with a serious interest in video editing (or presumably photo editing). None of these people are mainstream users. (BTW I fit into the “gamers”, “hobbyists” and “developers” categories. The first is why I wanted a good video card, the last a fast OCed CPU.)

        • I’d expand your list to gamers, hobbyists, and content creators. If you’re writing words, or making music on the computer, or editing video, or making software, or analyzing data sets, you need a real computer. Most people — content consumers — are fine with an iPad.

            • You used a laptop for much of it, right? Or the T100 convertible. How much of Lightning Fall was written on a straight tablet with external keyboard? The major distinction I was making above (which is not the same as the roll-yer-own vs commodity purchase distinction of the main post) is between “real” computers and one-thing-at-a-time tablets.

              I know what you’re getting at, I think: tablets are powerful enough for content creation. I don’t disagree, for some things. Not for others. In some cases it’s the sheer CPU power, RAM, or storage that falls short. (I’m thinking of the genetic programming runs I used to make or some of the engineering analyses or simulations.) In others, it’s because tablets usually limit users to one thing at a time, and I don’t know about you, but when I write I have a handful of browser windows open for quick reference while my text editor is open, preferably on another monitor.

              (Unless I’m missing not only your point but my own. I was interrupted repeatedly in the course of writing those two paragraphs and have no friggin clue what I was trying to say. Anyone with a lick of common sense would just delete this comment rather than post it, but, well, I’ve gotten married twice…)

              • That’s not what I’m getting at.

                Desktops are a distinct class of computer: Basically they are big things that sit on a desktop and don’t get moved. They are also large enough that the cases allow room for the average hobbyist to work inside them, add and change components, and so on.

                None of this is true for Notebooks or Hybrids, although my Asus T100 is way, way, way faster and more powerful than the last desktop I owned – and retired more than eight years ago.

                And this is all a moving target, anyway. Five years ago, the most intrepid overclocker could barely reach the speed and power of a top-end tablet of today. Yet those desktops were considered real “pro” computers, while more powerful machines in different form factors today are considered toys.

                I don’t get it.