Visual Studio 2008 Released to MSDN Subscribers

1 minute read

While Visual Studio 2008 was originally planned for a February 2008 release, November 19th, 2007 apparently seemed like a better date to Microsoft. MSDN Subscribers can already download the retail versions of Visual Studio 2008 Professional, Standard, and the Visual Studio Express editions through the Top Subscriber Downloads screen when logged on to MSDN.

At the moment downloads seem very slow (probably due to high network traffic) and connections might fail. Remember: because Microsoft is offering this through the Top Subscriber Downloads box, the downloads open in an Akamai download window instead of Microsoft's regular Transfer Manager, which appears to be the cause of these problems.

Click here to log into your MSDN subscriber account.

Coming Soon to MSDN Subscriptions – Visual Studio 2008

The next version of Visual Studio, Microsoft Visual Studio 2008, will provide an industry-leading developer experience for Windows Vista, the 2007 Microsoft Office system, and the Web. We expect to have Visual Studio 2008 editions available on MSDN Subscriber Downloads shortly after release. For a faster, more reliable download experience, please utilize “Top Downloads” below. All English Visual Studio 2008 editions will be available here first. Visual Studio 2008 editions also will be released—on a staggered schedule—to Subscriber Downloads. To find out more about the new versions, see the Visual Studio Developer Center.

MSDN Blog post

OpenGL 3.0: Finally some news!

2 minute read

After a long wait, an update has been issued on the OpenGL 3.0 specification. There is still no specification itself; what we get instead is the promise, reproduced below, that the OpenGL ARB is working hard on it.

You understandably want to know where the OpenGL 3 specification is. I have good news and some bad news. First the bad news. Obviously, the specification isn’t out yet. The OpenGL ARB found, after careful review, some unresolved issues that we want addressed before we feel comfortable releasing a specification. The good news is that we’ve greatly improved the specification since Siggraph 2007, added some functionality, and flushed out a whole lot of details. None of these issues we found are of earth-shocking nature, but we did want them discussed and resolved to make absolutely sure we are on the right path, and we are. Rest assured we are working as hard as we can on getting the specification done. The ARB meets 5 times a week, and has done so for the last two months, to get this out to you as soon as possible. Getting a solid specification put together will also help us with the follow-ons to OpenGL 3: OpenGL Longs Peak Reloaded and Mount Evans. We don’t want to spend time fixing mistakes made in haste.

Here’s a list of OpenGL 3 features and changes that we decided on since Siggraph 2007:

  • State objects can be partially mutable, depending on the type of the state object. These state objects can still be shared across contexts. This helps in reducing the number of state objects needed in order to control your rendering pipeline. For example, the alpha test reference value is a candidate to be mutable.
  • We set a minimum bar required for texturing and rendering. This includes:
    • 16-bit floating point support is now a requirement for textures and renderbuffers. Supporting texture filtering and blending is still optional for these formats.
    • S3TC is a required texture compression format.
    • Interleaved depth/stencil is a required format for FBO rendering.
    • At least one GL3-capable visual or pixel format must be exported which supports front-buffered rendering.
  • OpenGL 3 will not have support for the GL_DOUBLE token. This means it will not be possible to send double precision vertex data to OpenGL.
  • A format object has to be specified per texture attachment when a Program Environment Object is created. This helps minimize the shader re-compiles the driver might have to do when it discovers that the combination of shader and texture formats isn’t natively supported by the hardware.
  • GL 3 will only cache one error, and that is the oldest error that occurred.
  • The OpenGL pipeline will be in a valid state once a context is created. Various default objects, created as part of the context creation, will have reasonable default values. These values are such that a simple polygon will be drawn into the window system provided drawable without having to provide a Vertex array object, vertex shader or fragment shader.
  • GLSL related changes:
    • GLSL 1.30 will support a #include mechanism. The actual shader source for the #include is stored in a new type of object, a “Text Buffer” object. A text buffer object also has a name property, which matches the string name specified in a #include directive.
    • Legacy gl_* GLSL state variables are accessible through a common block.

More details will follow soon in an upcoming OpenGL Pipeline newsletter.

Barthold Lichtenbelt
OpenGL ARB Working Group chair

A retrospective view of a 1996 Steve Jobs interview

3 minute read

I know it’s easy to sit back and criticize what someone said 11 years ago. That’s why I won’t criticize Steve Jobs for what he said; rather, I’d like to show the difference between what was predicted and what the reality is at the moment - regardless of who said it.

On a side note: it doesn’t matter whether you like Apple, Steve Jobs, or any of the associated parties; Jobs is an incredible innovator and one of the great digital minds of the 20th (and, as it turns out, 21st) century. In all honesty, I personally do not “like” Apple’s machines and operating system, but that doesn’t mean I don’t appreciate the innovation they have brought about in the field of computing and computer software - it is simply mind-boggling when you research it.

This blog post (remember that it’s an informal blog post) is a personal view of an interview that Steve Jobs gave to Wired Magazine in 1996, published online at Aether.com. Interestingly enough, 1996 was the year that Apple bought NeXT (Jobs’ company at the time) and informally hired Jobs back; this interview precedes those events. I’ll only pick the things that are somewhat relevant to this site or that I find interesting; this is a blog, you know?

The desktop computer industry is dead

This will be the first thing that’ll pop out of the page. While at first this might sound like a ridiculous claim, you must place it in context with the entire section. For someone who was at the birth of Desktop computing, the 90’s might have seemed extremely dull in comparison to the 70’s and 80’s when many new things started to come into existence (The PC, CD-ROM, color displays, GUI, the Mouse, HDDs became cheaper, etc.).

When I went to Xerox PARC in 1979, I saw a very rudimentary graphical user interface. It wasn’t complete. It wasn’t quite right. But within 10 minutes, it was obvious that every computer in the world would work this way someday.

While this view was shared by competitors, it has proven to be about 99% right. Almost every computer in the world now works with some sort of graphical user interface, however underdeveloped. With the exception of pure Unix-based machines and low-level terminals, I can’t think of an operating system that doesn’t have an integrated GUI.

Objects are just going to be the way all software is going to be written in five years or - pick a time. It’s so compelling.

It took about five years for software developers to completely grasp Object Oriented Programming and apply it correctly, so in this regard Jobs was right. OOP has become the norm. While there are still many programmers out there writing procedural code, OOP has largely taken over - and that’s a good thing in most cases.

The Web is not going to change the world, certainly not in the next 10 years.

Jobs was a bit off here. While the dot-com boom didn’t change the entire world, it did change the way we use and look at the web in most parts of the world. I can’t think of many countries without internet access, or of many people who have never heard of the internet, outside underdeveloped or oppressed regions.

I don’t see most people using the Web to get more information.

While I do see Jobs’ point (he mentions information overload and the ability to process information), we tend to gather only the information we really care about - and massive amounts of it. For example, Wikipedia is one of the most active websites on the internet, used daily by a massive number of people.

As to what’s now known as e-Commerce:

I think we’re still two years away.

A bit too optimistic but pretty accurate. Again, the dot-com boom pretty much exploded e-commerce into our lives which happened around 2000-2001.

End thought: the thing that struck me most in this interview is how much Jobs expects from technology. It’s something to think about: “Am I, as an end-user, the catalyst of innovation?” or “If I expect more, will I get more?” It should be that way, but it often isn’t; companies don’t really seem to listen that much to their customers, with the rogue exception here and there. In the end, Jobs seems to have been pretty accurate with his predictions of the future. I’d like to say more accurate than another software icon, but then I’d just be trolling.

Jobs on the topic of superior European washing machines:

I got more thrill out of them than I have out of any piece of high tech in years.

Re: Microsoft Open Source

1 minute read

Matthew Mullenweg at Photomatt.net “predicts” that:

Microsoft will Open Source Windows before 2017.

While I usually like Matt’s posts - I get them through the Wordpress control panel - I think he might be a bit off this time. The Windows kernel is still under constant development and traces its lineage back to the DOS architecture first introduced in 1981.

This means that DOS is now 26 years old, almost 27.

Since MS-DOS has never been released under an Open Source license, I think it’s far-fetched to say that Microsoft will release Windows under one any time in the foreseeable future - even with Open Source getting more and more popular.

An older product, BASIC, which was first released on the Altair in 1975 (32 years ago), is no longer being developed and no longer ships with any Microsoft operating system, but it has evolved into Microsoft’s Visual Basic / Visual Basic .NET - products which, in their own right, have helped form C# and the Visual Studio product line. While you could get the BASIC source “code” (in patch point form), later binary versions of BASIC aren’t open and haven’t been opened up since.

While Microsoft gives you many tools to build upon the Windows platform, I don’t think Microsoft will open the source to Windows - not even Windows 1.0. I might be wrong - and I hope I am - but I have a pretty good feeling that current Microsoft CEO Steve Ballmer is more focused on monetary rewards for himself and Microsoft than on anything else.

Source / Reference: Microsoft Open Source

Atom: a gem in the making

1 minute read

Timothy Farrar over at FarrarFocus.com is creating a unique game called Atom. The reason I call it unique is that, unlike many games out there, the content for Atom is 100% dynamic, which means that (for example) everything can be set in motion. Here’s a quote from Timothy’s post on GameDev.net:

Atom started with the idea to go back to PC gaming’s roots (low risk investment, experimenting with technology, fun timeless gameplay, taking a wild idea from concept to market), while taking advantage of the power of modern hardware.

[…]

The graphics engine is completely unconventional, 100% dynamic (no static geometry, everything moves), and based on an animated solid hierarchical cellular representation with an “animation bone” for each cell which is linked into a physics/CFD engine which gives life to the world. Rendering is done via a special purpose painters order micro-impostor compositing engine (old-school, not based on polygons!) which also provides realistic motion blur. Content creation is done with a mix of hand controlled procedural generation.

Due to Atom’s unique world representation, you can literally zoom into the molecular structure of anything, even on the inside. This also works in reverse, Atom is able to simplify any structure, and thus has infinite level of detail control. A custom visible surface determination algorithm eliminates overdraw allowing for both wide and telephoto views inside and outside any structure no matter how sparse or dense the geometry.

Source: GameDev.net

The concept seems very promising; I suggest watching this project grow. Another thing worth mentioning is that the API used to generate the images is OpenGL, not DirectX. Timothy has posted on his blog that SM 4.0 support might be added to the engine. This is good news for XP users, since there will be no need to upgrade to Vista if you want to try out this game.

Check out the Atom project here and watch the videos.

The State of DirectX 10

less than 1 minute read

Quite recently I released a small article discussing whether you (as a developer) should upgrade to Vista to take advantage of the DirectX 10 functionality only available in Windows Vista. Quite simply, my answer was “yes”. As a developer you should always take advantage of new technologies, especially if that technology will replace an existing one.

HotHardware.com has released an article discussing the state of DirectX 10, aimed more towards the end-user than the developer. In it, both performance and image quality are evaluated to give a fair estimation.

It’s 13 pages long but worth the read; check it out at HotHardware.com.

id Software: John Carmack’s Response to OpenGL issue

less than 1 minute read

Recently I wrote a post about how id Software will no longer use OpenGL as its primary graphics API for game development. Here’s John Carmack’s response to the rumors:

There is certainly no plans for a commercially supported linux version of Rage, but there will very likely be a linux executable made available. It isn’t running at the moment, but we have had it compiled in the past. Running on additional platforms usually provides some code quality advantages, and it really only takes one interested programmer to make it happen.

The PC version is still OpenGL, but it is possible that could change before release. The actual API code is not very large, and the vertex / fragment code can be easily translated between cg/hlsl/glsl as necessary. I am going to at least consider OpenGL 3.0 as a target, if Nvidia, ATI, and Intel all have decent support. There really won’t be any performance difference between GL 2.0 / GL 3.0 / D3D, so the api decision will be based on secondary factors, of which inertia is one.

John Carmack

This was posted as a comment on Slashdot.

idTech 5, Rage, and more in 3 Videos

1 minute read

Somehow I keep getting back to this topic. Maybe it’s because of the excitement around the features promised by the idTech 5 engine as presented in the earlier QuakeCon demo.

Today was one of those exciting days for me. I just finished watching a two-part video commentary on the development of idTech 5 that was posted on GameSpy. A third video is an interview with John Carmack on the changes occurring within id Software, among other issues. The two commentary videos feature John Carmack and Matt Hooper from id Software.

Keep in mind that the “lag” you might notice is because you’re viewing a technical demo of an unfinished product. Another thing worth noting is that OpenGL is no longer being used as the primary graphics API; the company is moving towards DirectX instead.

The first video is presented by John Carmack and is about the development of idTech 5 and Rage. The backdrop for this video is pretty much continuous, so you might want to look away to avoid insanity. He speaks about graphics development, Enemy Territory: Quake Wars and other things that might be interesting to developers.

The second video introduces Matt Hooper as a secondary host and talks more about the tools and assets given to level designers and artists in idTech 5. For an unreleased product, it looks amazing. There seems to be a great (if not massive) improvement over older tools such as GtkRadiant, thanks to a more intuitive user interface. But you can judge for yourself by watching the videos attached to this post.

The third video announces a new “game”: an adaptation of the existing Quake 3 code base into a free-to-download online game. Hear all about it in the video.

Edit: as of 2023, it seems that these videos have been lost to the ravages of time.

Way Off Topic: Internet Taxes, November 1st 2007

less than 1 minute read

If you’re an American citizen or resident and you’re not aware of this issue, keep reading. Otherwise, take action. Links follow.

The Internet is the “Information Highway.” It provides unlimited information, a myriad of free resources, and some commercial resources. Due to the expiration of the “Internet Tax Freedom Act of 1998” we might be hitting some toll plazas along our routes. If you do not feel comfortable paying taxes for free information, sign up at DontTaxOurWeb.org to take some action. If you want to do more, write your senator, but do it promptly - we’re almost at November 1st.

This could potentially be doomsday for many of us. The main thing the tax would cause is a limit on information, which would hurt everyone. Since a large percentage of Americans use the internet, it is imperative that we keep an international resource out of local hands.

At this moment many in the Senate are opposed to internet taxes, but it won’t hurt to petition and write your congressman or congresswoman. This needs to be fought; freedom of information is at stake.

id Software: bye OpenGL, bye Linux

1 minute read

John Carmack and id Software have always been great supporters of the OpenGL graphics API. Alas, the studio’s major development with the API stops here.

id Software has been working on a new game engine that is said to “revolutionize” the gaming industry and provide many advanced features. One of its main features is that it can sustain a constant 60fps on console systems. It was also announced that the new engine will run on PC, Mac, Xbox 360 and PlayStation 3. Eh, are we missing something important here?

Linux. Linux has always been a supported system when it came down to games from id Software. Yet according to Todd Hollenshead (id Software), id’s upcoming game “Rage” (which will use idTech 5) will primarily be a DirectX 9 game - not DirectX 10, nor OpenGL. OpenGL will, however, be used for the Mac release of both the engine and the game.

This could mean that OpenGL becomes much less attractive in the game development community, since there will be one less patron to support its open cause.

One benefit from all this is that the game will be able to run on Windows XP and not solely on Vista, as all DirectX 10 games require. This, of course, is no consolation for the Linux people who will now have to run the game on Windows.

DirectX 10: Is it worth upgrading to Vista for?

2 minute read

As a developer (and gamer) you always want the latest gadgets and DirectX 10 seems to be one of these gadgets that you just “need to have”.

But does it justify spending much money for an upgrade?

What does DirectX 10 have that DirectX 9 doesn’t? According to some technology demos - not that much. Ok, some special effects, but what about the rest? A game isn’t made out of SFX alone (although that seems to be becoming a trend).

It turns out that DirectX 10 isn’t only about special effects; it defines a new baseline for next-generation technology. Video card manufacturers are forced to build optimized pathways to support the new industry standard, resulting in very fast hardware such as the GeForce 8 series (8800 GTX, 8800 GTS).

One of these features is WDDM, which stands for Windows Display Driver Model. WDDM is basically a resource manager for graphics processes. One example of what WDDM does: in XP, when you switch away from a Direct3D application, you can receive a DEVICE_LOST error, which basically means that a single process has exclusive use of the GPU. Your application crashes unless you write a number of handlers dealing with the error. According to Microsoft, this is now a thing of the past.

Since the Vista desktop is a 3D environment, you’d otherwise lose your application every time you minimize. This has now been eliminated. Each GPU process gets its own thread (just like in regular programming), meaning that you can have any number of 3D processes running without the need for special handlers. In the same category, DirectX 10 also brings improved crash handling.

Ok, so that’s neat but I’ll need more to be convinced to switch to DirectX 10 hardware and Windows Vista.

Another feature of WDDM is that if you run out of video memory, WDDM can virtualize your system memory for video processes. Which - in theory - sounds very cool, but I don’t know if this causes slowdowns.

Here’s a list of new features:

  • Shader Model 4
  • Texture arrays
  • DirectSound is gone, XACT is its replacement
  • Less load on the CPU - GPU tasks really get processed by the GPU this time
  • Unified Pipeline Architecture - the Programmable Graphics pipeline (SM 4.0)
  • No object limit - There is no software limit to how many objects you can add to your scene. The only factor in this is your graphics hardware
  • Geometry Shaders
  • Instancing 2.0 - An optimized version of the instancing technology found in the GeForce 6 series and up and the Radeon 9500 and up

So as a developer, is it worth upgrading to Vista and DirectX 10 hardware? In my opinion, yes. This is simply the new generation of computer graphics; just because you have to upgrade doesn’t mean it’s evil.

Remember when you upgraded from your TNT2 card to a GeForce? Same thing. Yet this time the improvements revolve more around the pipeline than around the actual quality of the image.

Programmer’s Block

less than 1 minute read

I guess writers get this, but now I have it with coming up with new programs. I just can’t think of anything. I’ve been wasting my time reading programming books on Agile, design concepts, etc., but those didn’t give me any ideas either. Some knowledge, yes; inspiration, no.

Bah.

Google Music Trends

less than 1 minute read

Google Music Trends

This is probably one of the coolest ideas I’ve seen on the Internet in a loooong long time. Google Music Trends is a small program you download that interacts with your music player and uploads listening statistics to Google. At the moment the most-listened-to song is “In the End” by Linkin Park. Of course this has to change ;)

Again, participate!

ATi too slow or NVIDIA too fast?

1 minute read

DirectX

In November of last year, NVIDIA released its latest flagship graphics card chipset, the 8800 / G80. This chipset featured the first full DirectX 10 support ever shipped and was (and at the moment still is) the only DirectX 10 card on the market.

So when will the other giant, ATi, catch up? According to rumor, ATi is planning to release its latest chipset (R600) in the first quarter of 2007, more precisely an early March launch.

But did ATi push its launch too far back, or did NVIDIA jump the gun on this one? Some say that ATi is holding back on purpose so it can improve its technology and beat NVIDIA performance-wise. Others say that ATi didn’t expect NVIDIA to launch this early and is trying to throw together a product too fast. The latter I personally don’t believe.

ATi has been a major player in graphics technology for a while, so it would be a strange decision to underestimate DirectX 10 and keep focusing on its X1900 technology. I think we can safely assume that ATi is preparing its product to perform to the fullest, and is maybe waiting for better software support in Vista - something NVIDIA has been lacking.

Also, what good is a DirectX 10 card if there are no DirectX 10 games out yet? Again, some say that the 8800 is simply “the best DirectX 9 card out there that happens to have DirectX 10 support.” Yet according to hardware tests on TomsHardware.com, the ATi X1950 XTX stands head and shoulders above the card.

So what can we expect in the near future from these two graphics giants? In my opinion (Nostradamus-style), ATi will launch the R600 chipset and make the G80 look like something from 1999 performance-wise. The 8800 GTX will get a successor soon enough to compete with the R600, and I think ATi will give you more value for the money, as has been happening with all AMD and ATi products lately (they're the same company now, after all).

I was eagerly looking into buying a G80 or 8800 but I have a feeling I’ll be regretting it within the span of one month. After all, the G80 is already 4 months “old”.