My History of Visual Studio (Part 1)
[All the other Parts: History of Visual Studio]
I wrote in the teaser that there is no one “History of Visual Studio”, there are as many histories as there were people involved. If I may repurpose a famous quote, “There are eight million stories in the Naked City...” This is one of them.
Visual Studio’s history probably begins in 1975 with Bill and Paul’s decision to create Altair BASIC. Maybe you could start before that but I think that’s as far back as you could go and still say it’s about Visual Studio at all – it’s at that time that Microsoft decided it was going to have software development products and not, say, toaster automation.
Old timers like me used MS-BASIC on many different classic microcomputers; for me it was mostly the Commodore PET (I started on the PET 2001 model, whose name is especially ironic considering how unrepresentative it is of a 2001-era computer, but I digress). You could have used MS BASIC on an Apple, or a TRS-80. Many of us cut our teeth on those computers, and the idea of rich immediate feedback, a sort of personal programming experience, was branded on us.
Those of us that really wanted to know what made a computer tick (literally) would spend countless hours tearing apart the software that made them run; I’m fond of saying that Bill Gates taught me 6502 assembly language by proxy and so was training me for my job at Microsoft years later.
The 80s were a time of genetic diversity in computers, and so in programming tools as well. Lots of people were making programming languages and I’m going to avoid dropping names to stay out of trouble with trademark holders but I think you can remember many different first class systems for the computers of the time. I can think of at least 5 off the top of my head.
If you were creating tools for the PC, which was starting to be the dominant machine by the middle of the 80s, your life was especially difficult. The x86 instruction set wasn’t terribly fun; the machine architecture with its 64k memory segments was enough to give you a migraine. These challenges, and the need to milk every bit of performance out of processors, resulted in very bizarre PC-specific language constructs.
It’s 1988 and I’ve landed at Microsoft fresh out of college. I had my own Pascal compiler under my belt (built with 4 friends in school), and an assembler and linker to boot. But I was not ready for:
char _near * far pascal GetString(char far * far * lplpch);
For starters, what the heck was a Pascal keyword doing in my C programming language? The only thing Visual about this thing was that the nears and fars were making me wish for bifocals (recently granted, and I was better off without them).
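For anyone who never had to live in segmented mode, here is my reading of that monster, as annotation (this is 16-bit Microsoft C syntax, which no modern compiler will accept):

```
/* char _near * far pascal GetString(char far * far * lplpch);
 *
 * Reading it inside out:
 *   - GetString is a "far" function: called through a full segment:offset
 *     address rather than a 16-bit offset.
 *   - "pascal" is the calling convention: arguments pushed left to right
 *     and the callee cleans the stack (smaller call sites than cdecl).
 *   - It returns "char _near *": a 16-bit offset into the default data
 *     segment, only meaningful while DS points at that segment.
 *   - Its parameter is "char far * far *": a far pointer to a far pointer
 *     to char, so both levels of indirection carry full segment:offset.
 */
```

Every one of those qualifiers changed the generated code, which is why they leaked into the type system at all.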
I need to rewind a bit.
The compiler that I was using to do my job in 1988 was Microsoft C 5.1 – a great little language and maybe one of our more successful releases. Microsoft had enjoyed considerable success in the languages space leading up to that time but in recent years, and for some time to come, a certain company, whose name starts with a B, was eating our lunch. Great tools, great prices, it was very motivational in Buildings 3 & 4 on the Redmond campus.
So the C product was already on its 5th major release. The BASIC compiler “BC6” had just shipped and QuickBASIC “QB4” was going out the door.
You may have noticed I’m not mentioning C++ yet. We’re still a goodly distance from that part of the story.
So where am I? Ah yes, 1988. The project I’d been hired to work on was cancelled after a few months (I expect I’m in good company on that score), that project by the way was a cute variant on the C language designed for incremental compilation – it was called, cough, C#. Strangely, through the lens of 2009, it looks remarkably like what you would get if you tried to make C.Net.
The mainstream project at that time was C6.0, it featured a bunch of cool things, new compiler optimizations, better debugging, and more. Its primary target was an operating system called OS/2 – you may have heard of it – but it also had to run well on DOS. A tall order that.
I was working on the programming environment, in particular on the source browser, largely because I had worked on the source browser for the C# system and people liked it and many of its ideas had found their way into the C6 product already. I suppose nobody will be surprised that one of the first things I had to do was improve the performance of the thing.
Anyway the programming environment, arguably the first IDE we ever shipped, was called “PWB” or Programmer’s WorkBench. It was all character-mode graphics but it used that funky Character Windows thing that was prevalent in the flagship applications Microsoft was shipping at the time. CW, or just COW as we lovingly called it, was a delightful environment that provided all kinds of automatic code swapping (but not data) to allow you to have more than 640k of total code while still running on a regular PC. Its swapping system bears a lot of resemblance to what was in the real Windows of that era (2.x, I think it was).
Now the thing about having only 640k of memory and trying to have an IDE is that you can’t actually expect the thing to be loaded and running while you’re doing something like a build, or debugging, or basically anything other than editing, because you simply don’t have the memory. So this beauty used some very slick tricks. To do a build, for instance, it would first figure out what build steps were needed, write them to a file, and then exit, leaving only a small stub behind to execute those steps. The stub would run the steps, and then as the last step restore the environment to exactly where it had been when it exited, creating the illusion that it had been resident the whole time, which it was not.
Debugging used even more sleight of hand.
I think the Codeview debugger may be one of the first, and possibly the most important DOS Extended applications ever written (because of influence). You see the debugger is in a difficult position because it needs to have symbols and other supporting data active at the same time as your program is running, and it doesn’t want to disturb that program very much if it can avoid it. This is quite a challenge given that memory is as tight as it is – but there was a bit of an escape clause. Even in say 1989 you could use features on your 386 processor (if you had one) to get access to memory above the 640k mark. These kinds of shenanigans were commonly called “using a DOS extender” and I think the Codeview debugger probably had one of the first ever written, and I think a bunch of that code later inspired (who knows) the extension code in another product you may be familiar with – that would be Windows 3.0. But that is really another story.
All right, so: an optionally DOS-extended character-mode debugger, a character-mode editor, and a build system that makes the product exit to do anything. Get all the bugs out of it and presto, you have the first MS IDE.
Lots of languages built on PWB; it was designed to support a variety of them. I know the browser formats supported Basic, Pascal, FORTRAN, COBOL, and C-style symbols out of the box. Most of those actually saw the light of day at one time or another.
That was spring of 1990 and that was C6.0. That IDE was the basis for the compiled languages for some time.
However, things were not standing still.
C++ was taking the world by storm, and having a high quality optimizing C compiler was old news, but lucky for us we had not been idle during this time. While some of us had been busy getting C6.0 ready others had been busily working on a C++ front end for our compilation system. I think mostly everyone was sure we could finish up that C++ compiler in no time at all (I say mostly because there were people who knew better).
Wow, were we wrong. I mean, seriously, majorly, what-were-we-THINKING wrong.
It turns out C++ is a hard language to compile; heck it’s a hard language to even understand. I remember this one particular conversation about how tricky pointers-to-members are with eye-popping results when it was pointed out that one of those things could point to a member defined in a virtual base… C++ is like that, a lot of things seem easy until you combine them with other things and then they get hard.
Meanwhile we were struggling to get the needed features into the debugger stack, it turns out that creating a C++ expression evaluator is no easy feat either. Expression evaluators are lovely things – they are frequently called upon to evaluate expressions that would be illegal in any actual compilation context (e.g. find me the value of a static variable that is currently out of scope, or the value of a global defined in another module). Expression evaluators have to do all these things while still retaining the feel of the language and remaining responsive.
Did I mention all this could be slow?
I was working on a new development environment, targeting the new 3.0 release of Windows – another project that never saw the light of day – but we were having fits trying to get windows.h to compile in anything like a decent amount of time.
That’s when the precompiled headers miracle happened.
I call it that because the .pch literally revolutionized the way we built things with our tools. Other products had used similar schemes in the past but ours had some very clever notions. The most important of these was that, since it was snapshot based, it guaranteed that the net effect of the headers up to the PCH point was absolutely identical in every compiland. That meant that, for instance, you could exactly share the debugging and browsing information as well as all the compiler’s internal state. The fact that when you #include something you may or may not get the same net effect in one file as in another is the bane of your existence as a tools person, and this was immediate relief! And it was fast!!!
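That same snapshot idea survives in today’s toolchain. A minimal sketch of how you would drive it with cl.exe (the file names here are my own; /Yc and /Yu are the create/use switches):

```
:: Compile the headers once, up to the snapshot point, producing the .pch
:: (pch.cpp contains nothing but #include "pch.h")
cl /c /Ycpch.h pch.cpp

:: Every other compiland reuses the snapshot instead of reparsing the headers
cl /c /Yupch.h a.cpp
cl /c /Yupch.h b.cpp
```

The guarantee is the same as it was then: every compiland that uses the snapshot sees exactly the same compiler state at the PCH point, so everything computed up to that point can be shared.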
I’m not name-dropping, but suffice to say I know the person who did this work very well, and it was one of those weekend miracle deals that you read about: it couldn’t be done, can’t be done, oh wait, there it is.
Meanwhile, yet another team was working on a little something called Quick C for Windows, which turned out to be hugely important. A lot of groundbreaking work went into that product. It was the first IDE for Windows with real debugging, but I’d have to say it was incomplete, and I’ll talk more about why that is hard in just a second.
Meanwhile that other company was not standing still and they were delivering great C++ compilers. It was 1992 before we had anything at all. In those months we delivered C7 on top of PWB again (PWB 2.0) and, blink-and-you-missed-it, we delivered Quick C for Windows (QCW).
My project was cancelled. Again. It’s a good thing I had my fingers in a lot of pots :)
By the way, the C7 product was, by mass, I believe, the heaviest thing we ever shipped. I don’t think we ever tried to deliver that many books ever again.
So we shipped a bookshelf and now things were getting interesting.
We couldn’t ship PWB again, we needed a graphical development environment, our basis for this was going to be QCW and work was already underway to generalize it but oh-my-goodness there was a lot of work there. Also, we were missing some critical C++ language features; the competition was not standing still. We had a very limited class library that shipped with C7, MFC 1.0. We needed an answer there, too. And did I mention that we needed a speed boost?
Any of these bits of work would be daunting, but let me talk about just a few of them. First, debugging.
Debugging 16-bit Windows (it would be Win3.1 by the time we were done) is nothing short of a miracle. Win16 is cooperatively scheduled; it has no “threads” per se, there is just one thread of execution. Now think about what that means: you can’t ever actually stop a process, because if you do the entire world stops. So if you’re trying to write a GUI debugger, actually stopping the debuggee is sort of counterproductive. Instead, you have to make it LOOK like you stopped the debuggee when actually you didn’t. You let it keep running, only it isn’t running any of the user’s code; it is running some fairly (hopefully) innocuous debugger code that keeps it dispatching messages, lets the debugger run, and doesn’t actually proceed with the user’s whatever-it-was-they-were-doing.
A tiny part of this miracle is that when the debuggee “stops” you have to, on the fly, subclass each and every one of its windows and replace its window proc with something that draws white if asked, queues up important messages to be delivered later, and mostly does a lot of default processing that is hopefully not too awful. That’s quite a trick of course when any running process could be trying to send messages to the thing for say DDE or OLE or something. It’s the miracle of what we call “soft mode debugging” and that’s what we delivered.
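A pseudocode sketch of that subclassing trick as I’ve described it (SetWindowLong, GWL_WNDPROC, and DefWindowProc are the real Win16 API names; everything else here is illustrative):

```
// When the debuggee "stops", hijack every one of its window procs.
for each hwnd belonging to the debuggee:
    oldProc[hwnd] = SetWindowLong(hwnd, GWL_WNDPROC, SoftModeWndProc)

SoftModeWndProc(hwnd, msg, wParam, lParam):
    if msg == WM_PAINT:       paint the client area white; return 0
    if msg is DDE/OLE traffic: queue it for replay later   // don't break protocols
    otherwise:                 return DefWindowProc(hwnd, msg, wParam, lParam)

// On "go": restore every oldProc, then replay the queued messages, so the
// rest of the system never notices the process was "stopped".
```

The whole system keeps pumping messages the entire time; only the user’s code is suspended.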
Meanwhile, the tools… Well there was this thing called Windows NT going on, maybe you’ve heard of it, we didn’t want to produce all different kinds of binaries for hosting in different environments so we needed to change our dos extension technology to be something that allowed us to run Windows NT character mode binaries on DOS. That’s exciting. And it had to be regular vanilla DOS or DOS as running inside of Windows. Double the excitement. But sure, we did that too (with quite a bit of help from a 3rd party that again I’m not naming because I don’t want to go there).
And the tools were too slow. Yet another effort went into putting codeview information into .pdb files to alleviate the painful de-duplication of debug info that was the cvpack step; those steps were then fused directly into the linker so that we didn’t write out the uncompressed stuff only to read it back in and compress it, so we could write it out again. Add that to some practical compiler changes and we were, for the first time in a very long time, the fastest C++ compiler out there (at least according to our own internal labs, YMMV).
Meanwhile MFC was coming along very nicely thank you very much. And there was a designer and a couple of critical wizards and wow this thing was starting to feel like VB: draw, click, wire done.
I really should say something about VB.
The code name for Visual Basic 1.0 was Thunder. I thought it was arrogant when I first heard it. I thought their “Feel the Thunder” attitude was just some cocky boy swagger. I was wrong.
There was a reason every product wanted to be like Visual Basic, they changed their names to Visual <whatever> and tried to get that feel for their programmers. It was because it was just that good. Not everyone had that hot-while-you-type interpreter action going on in their environment but boy was everyone trying to recreate the key elements in their space. We certainly were. By the time we were done Visual Basic had profoundly affected the design experience and the framework – it wasn’t VB but it was Visual – it was Visual C++ (that’s C8 and MFC2 if you’re keeping score.)
We did a very fun follow-on release where we got the 16 bit tools working under Windows NT in 16 bit mode (kudos to their compat people, we were not an easy application to port) and we added more OLE and ODBC support. People were really liking this thing and for the first time since I had been at Microsoft I felt like we had the edge against our competitors in the C/C++ tools space. We were still missing language features but what we had was pretty darn cool.
While that was going on, a few of our number were doing something else that was pretty darn cool. They were porting all this stuff to 32 bits and getting it to run natively on Windows NT. That would be around the Windows NT 3.5 time-frame. That and getting a Japanese version out and generally fixing all of our bad “that doesn’t work internationally” practices. Very important, totally underappreciated work that was.
So Visual C++ “1.1” the NT version was available at about the same time as 1.5, as was the J version.
I guess I would be remiss if I just left it at that. Internally some people thought "1.1" wasn’t a real product, that it was “just a port.” These are people who clearly know nothing about development tools. The 1.1 system included a totally new 32-bit compiler back end. We all know just how portable back ends are, especially against totally diverse machine architectures. Oh, and it also included a totally new debugger back end with a totally different theory of operation, because of course Windows NT *does* have threads that you can stop. Oh, and of course the memory model for the whole program was different: far and near were gone. But yeah, other than those few minor things, it was just a port.
That was 1993 and we called the product “Barracuda” – I had not much to do with it personally but those guys deserve a salute in the History.
Things are about to get really exciting, though. The most important release of the C++ toolset in my memory is VC++ 1.0 “Caviar”; without it I think our tools would have died. But almost as important is the next release, which I’ll write about in the next installment: VC++ 2.0 “Dolphin”, which truly integrated things for the first time.
[I would really be tickled if other people would write their own "My History of VS", either in the comments or on their blog or anywhere they like]
[See The Documentary on Channel 9!]
Comments
Anonymous
October 05, 2009
Ah, the memories. I used C 5.1 for years; I could probably still look at disassembled object code and produce the C that compiled it, it was nice and predictable. As for the Windows debuggers, it was years before I found anything more productive than running CodeView or Soft-ICE on a Hercules monitor, leaving the GUI unmodified. I was actually amazed that the GUI version worked at all.
Anonymous
October 05, 2009
Thank you. I consider this sort of a professional career version of Harvard's case-based MBA. Excellent historical information and excellent inspirational guidance for developers.
Anonymous
October 05, 2009
The comment has been removed
Anonymous
October 05, 2009
I have the Barracuda ship-it too Phil, it's better than the "real" ones. :)
Anonymous
October 05, 2009
Going by the definition of history given in this blog, I started with QB 4.5. Can't remember much after that until years later when I won a bug competition in a computer magazine. The bug was how to 'close' the Start button in Windows 95 so it disappeared from the taskbar. I won a copy of Visual C++ 6.0 and taught myself C/C++ using it. Later I learnt .NET with Visual Studio 2003 and have regularly used VS ever since. And just to get this off my conscience, I found that Win95 bug on the Internet. It was never mine, I cheated!
Anonymous
October 05, 2009
I learned C++ while using Borland C++Builder and Microsoft Visual C++/MFC. I must say that while Visual C++ has always been mostly a very nice environment to work with (albeit slightly quirky), MFC really is disgusting to work with and no fun at all. The class library and framework that Borland turned out was much more pleasing from a lowly application programmer's point of view. MFC has stolen countless hours from me with its strange and sometimes seemingly nondeterministic behaviour. Kudos for VC++ though :-)
Anonymous
October 06, 2009
Ohhhh the memories... One of those "DOS extenders" was DOS/4G (DOS4GW.EXE): http://en.wikipedia.org/wiki/DOS/4GW. I would have liked to be one that can write about "my history of Visual Studio" but I barely touched VC++ 6.0 and then Delphi and finally, in 2004, Visual Studio 2003. I was late.
Anonymous
October 06, 2009
As a heavy user of Visual Studio for C++ for about 10 years I really appreciated this first-person account of its history. I have often wondered about the nuances of technology's stories. There is so much development and learning behind modern applications it can be dizzying. Unfortunately the history of each product is far too complicated for a single person to be intimately familiar with, so in a certain sense we are doomed to repeat earlier errors. Thanks again, I'll definitely share your tale with the guys.
Anonymous
October 06, 2009
I think I've used all of those compilers, all the way back to C 5.1 in DOS. Today I've been coding in MSVC++ 2005. Thanks for the backstory - very informative. A bit more on the MS view of the competition at the time would have been intriguing. Yes - Borland were very influential in the early 90s. But what about Glockenspiel? I seem to recall them making all the running with CommonView C++ in the late 80s. Also remember a John Carolan presentation explaining why templates were so cool, but so hard to implement. Those early MS C++ compilers didn't support templates IIRC. And if we go back to ~85, what about Xenix? Was the Xenix C compiler an MS compiler, or was it the std compiler with BSD or System V? I can't remember whether Xenix was BSD or System V... Yes - for those of you too young, MS once put out a Unix distrib known as Xenix!!
Anonymous
October 06, 2009
Rico, great post. I'm looking forward to the next installment. But you might have mentioned that it wasn't until MSC 4.0 (IIRC) that Microsoft wrote its own C compiler. All previous ones were private-labelled (by Microsoft) versions of Lattice C. Speaking of MSC 4.0, the ads (in Byte, Dr. Dobb's, etc.) for it showing Codeview just blew me away. And that's what had me salivating, just waiting for the product to ship. And you better believe that I bought that package practically the day it came out. And it didn't disappoint me. Codeview was (and is) simply great.
Anonymous
October 06, 2009
The comment has been removed
Anonymous
October 06, 2009
The comment has been removed
Anonymous
October 06, 2009
Err... thanks... I think :)
Anonymous
October 06, 2009
The comment has been removed
Anonymous
October 06, 2009
I think I'll do a little aside on Sequoia in the next posting -- I'll be up to Part 3. Do you remember the baby Sequoia tree I planted in my backyard? It's bigger now :)
Anonymous
October 07, 2009
That's a serious post! Really back to blogging with a vengeance, eh? I've been using Visual Studio pretty much since version 6. Prior to that I did most of my, er, coding in QuickBasic. I first got into VS more seriously when I decided I wanted to learn the Windows API, GDI, and DirectX in some ill-fated attempts at making games. The games didn't work out, but I did succeed in learning all of those pretty well, and got hooked on VS as my C++ (and now C#) IDE. Using the 2010 Beta now on my own and 2008 at work.
Anonymous
October 08, 2009
Thanks for the memories. I'd almost forgotten Windows 2.0, with the (serial?) screen sitting next to my 16-bit color screen (256 colors with palette management!). I remember doing stuff in C++ later on, but it was not an MS product, not Borland either. It was a pre-processor. Anybody remember the name?
Anonymous
October 08, 2009
I'm pretty sure you mean the Glockenspiel version of cfront -- a C++ to C converter.
Anonymous
October 09, 2009
Oh man ... Codeview! I've been doing this so long I almost forgot about it! What a great memory, thanks! I still remember coding MASM (and machine code) back in the DOS 1.1 days. Boy was that ever cool stuff!
Anonymous
October 09, 2009
The comment has been removed
Anonymous
October 09, 2009
I find it funny that you said Bill taught you 6502 assembly. The 6502 was the MOS Technology processor later used by Apple. I still have my original books on the 6502 as well as an original chip or two I wire-wrapped into a computer. Bill was writing for 8080s or Z80s.
Anonymous
October 09, 2009
The Commodore PET ran Microsoft BASIC, and from what I've heard over the years it was a pretty straight port of the 8080 flavor. I think at that time Bill still had his hand in the code and I wouldn't be surprised if he wrote a good chunk of it. But I only meant it as a metaphor anyway. By the time I was looking at the PET 2001 the Apple had already been out there for several years, so I'd say Apple Integer BASIC was somewhat older than the version used in the PET. The TRS-80 in the lab of course had a Z80 in it. I'm fairly certain Bill knew several assembly language variants. I certainly did :)
Anonymous
October 11, 2009
I distinctly remember playing around with a Windows 2.0 version. It had a very rudimentary interface with a couple of games. In fact I vaguely remember even a Windows version 1. Also I am certain that I have actually programmed with VB 1 as well as VB 2. I know because EGA cards had just come out and I bought one at that time. I fondly remember the extenders and HiMem and Alt-Tabbing into my memory manager to switch between apps. Keep 'em coming.
Anonymous
October 14, 2009
Rico, it is nice to see someone else who started with a Commodore PET 2001! I appreciate your perspective on how things have developed over the years. I'm looking forward to reading the rest of your entries on the history of Visual Studio. One thing I miss as a developer is the direct interaction with the machine that the old 8-bit world gave us. There is just too much intermediation between programmer and machine these days. I've decided that the first programming experience that my sons will have will be on whatever legacy hardware I can collect. My 12-year-old is now learning Microsoft Basic and LOGO on a 25-year-old Atari 800 and various old Commodore stock including a Commodore 16 and +4 (just for fun). Once he really gets that direct feedback experience and desire to make the machine dance, he'll be ready to graduate to more modern tools, or perhaps I'll direct him to a Commodore 64 with an assembler cartridge!
Anonymous
October 14, 2009
There are great emulators for these devices by the way, and they are tons of fun :)
Anonymous
October 19, 2009
Rico: Thanks for writing this. This brings back so many memories. I joined the party in 1988 and shared in all of these amazing things. Things to add ... shortly after I joined, C5.2 was cancelled and the death march to C6.0 began. I worked on NMAKE and took it from the original prototype to the finished product. It was, and still is, the most useful tool used to build anything using C and C++ "makefiles". It had its own mini-parser and a C/C++ preprocessor-compatible expression evaluator. I then worked on the C++ compiler front-end team and saw the early days of C++, PCH, PDB, debugging and Visual C++. Performance was so critical those days that I redid YACC to generate a C++ parse object. (The C++ compiler uses a C++ parse object -- talk about eating your own dog food.) I did my major work in run-time type identification and pre-parsing templates (which Bjarne said could not be done) and worked on C++ IntelliSense -- what we called dynamic parsing -- which used Rico's BSC file in a modified NCB format. All Visual C++ projects have these NCBs.
Anonymous
October 19, 2009
Did you read the last posting yet, Sundeep? We finally got rid of all those NCBs after all these years! Do you remember when we had the bug in the DOS build of PWB and we spent days only to find that we had overwritten a little buffer by one byte, which led to a failure reloading PWB much much later? That was the hardest bug find ever :)
Anonymous
October 25, 2009
The comment has been removed
Anonymous
October 27, 2009
Thank you for the comments -- I would have liked to cover even more but this epic ended up topping 20,000 words so I had to abridge a lot. QuickC was cool -- cool enough that it needed a Windows version. Thank goodness we did it :)
Anonymous
November 02, 2009
One can see the quality difference in VS 6 versus .NET by just comparing the C++/Win32 API documentation with the .NET 3.x documentation. The C++ documentation from the VS 6.0 era is miles beyond in quality. MS has been making good strides in rectifying this by repopulating the clear-cut documentation areas of VB6 (clear-cut with .NET 1.x) and older Office versions.