In building a PC mostly as a workstation, the GPU was more of a secondary consideration for me. My thought process was: sure, I’ll play some games every once in a while, but really, this is my work machine. It needs to do workstation stuff and not much more.
If you’ve been following along, you know that I’ve been trying to use Linux for my local development purposes on an Intel NUC (as well as a Lenovo laptop). I have to say, I still love the NUC and am going to find another use for it somehow; I’m just not sure what yet. However, Linux was somewhat of a disaster for me.
For a while now, my 2011 13” i5 MacBook Air, which I use for development, has been showing signs of impending death: a flickering screen when moving the lid, excessive heat, refusal to charge, two dead replacement batteries and three dead chargers, slowdowns with each new OS X/macOS install, and so on.
You’ve heard of Intel’s recent “Cougar Point” chipset screwup, which caused all motherboards for Sandy Bridge-based Core i7 processors to be pulled from the market until next month, when new parts will appear. We’ve also all read the reviews showing the new i7 processors to be very fast, beating AMD’s current lineup in performance. So where does AMD stand?
Sandy Bridge is Intel’s new microarchitecture, slated for release in January of 2011. Mainstream Sandy Bridge processors will effectively phase out the current LGA 1156 lineup with parts fitting a brand new socket, LGA 1155, while the enthusiast LGA 1366 platform is slated to be succeeded later by LGA 2011.
While I understand the need for technology to move forward, and am an avid user of Intel CPUs myself, I can’t help but feel underwhelmed by the reviews, especially on the IGP (integrated graphics processor) front, the part that concerns me greatly.
The tech that Intel chose to use on the die is the HD 3000 and its little brother, the HD 2000. Let’s hope it’s nothing like the GMA (Graphics Media Accelerator) X3000 and family, the scourge of the entire IGP world. Please, Intel, please.
Note: the continued body of this blog post was not archived, and is thus lost to the ages.
Intel demoed Larrabee for the first time to the public at the IDF (Intel Developer Forum), according to PC Pro.
The attached screenshot is a bit underwhelming, but maybe we’ll see some impressive examples soon. In any case, if the demo is at the IDF now, the public release can’t be far off.
Edit (2010)
In case you haven’t heard: it seems Larrabee has been cancelled for good.
Over the last week I’ve been annoyed with the capabilities of the Intel 945GM chipset. While this chipset isn’t a popular target for graphics development, it’s one of the most common chipsets Intel offers and comes standard in plenty of PCs. So I unhooked my graphics card and attempted to run some of my code on its integrated GPU.
Direct3D seems to work fine, and it’s pretty fast too. Pixel Shader 2.0 appears to be supported in hardware, and Pixel Shader 3.0 through a software device.
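If you want to check this kind of thing yourself, a caps query against Direct3D 9 does the trick. Here’s a minimal sketch (assuming the D3D9 SDK headers are available and you link against d3d9.lib; error handling trimmed):

```cpp
// Minimal sketch: query the default adapter's pixel shader caps via D3D9.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    // D3DDEVTYPE_HAL reports what the hardware itself supports;
    // D3DDEVTYPE_REF would report the software reference device instead.
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // PixelShaderVersion packs major/minor; D3DPS_VERSION builds a comparable value.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            printf("Pixel Shader 2.0 supported in hardware\n");
        else
            printf("No hardware PS 2.0 on this adapter\n");
    }
    d3d->Release();
    return 0;
}
```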
Now, you’d expect the OpenGL implementation to have the same capabilities, or at least hardware fragment shading on par with Pixel Shader 2.0. Turns out the OpenGL version on this device is OpenGL 1.4, with, to my knowledge, no support for fragment shaders (GL_FRAGMENT_PROGRAM_ARB). I wonder why this is. The hardware capability is available, so why not make use of it?
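On the OpenGL side, the extension string is what tells you whether ARB fragment programs are exposed. A minimal sketch of the check (assuming a current GL context has already been created, e.g. via GLUT or WGL; the function name is just for illustration):

```cpp
// Minimal sketch: report the GL version and whether ARB fragment programs
// are exposed. Assumes a current OpenGL context already exists.
#include <GL/gl.h>
#include <cstdio>
#include <cstring>

void ReportFragmentProgramSupport() {
    // GL_VERSION shows which core version the driver exposes (1.4 here).
    printf("GL_VERSION: %s\n", glGetString(GL_VERSION));

    // The extension behind the GL_FRAGMENT_PROGRAM_ARB token is named
    // "GL_ARB_fragment_program" in the extension string. A robust check
    // would tokenize on spaces; strstr is good enough for a quick test.
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    bool supported = ext && strstr(ext, "GL_ARB_fragment_program") != nullptr;
    printf("GL_ARB_fragment_program: %s\n", supported ? "yes" : "no");
}
```

On the 945GM, the second line comes back negative, which is exactly the complaint here.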
As of this post, Valve’s hardware survey still reports 22,183 people out there with the Intel (ialmrnt5.dll) driver. If that many gamers are using Intel’s chipsets, you can expect the regular end-user base to be much larger. Why not give them some updates? Please? :o)