During a recent trip to Microsoft's Redmond campus with Janet Robbins and Mike Otey, we had the chance to sit down and chat with two of the most notable figures in the history of Windows, Mark Lucovsky and David Thompson. For those of you not familiar with the early days of Windows NT, known then simply as NT, both Lucovsky and Thompson played key roles in the development of this important software project. Mark Lucovsky, Distinguished Engineer and Windows Server Architect at Microsoft, joined the company with the original wave of ex-Digital Equipment Corporation (DEC) employees that accompanied NT architect Dave Cutler. Known primarily for his unusual ability to grok how the thousands of components in NT work together, Lucovsky is widely hailed for his technical acumen and his early efforts to change NT from an OS/2-based system to one that ran 32-bit Windows applications. David Thompson, Vice President of the Windows Server Product Group, joined Microsoft in 1990 and led an advanced development group in the company's LAN Manager project before joining the NT team later that year. There, Thompson guided the development of NT's networking subsystem, ensuring that the product would work not just with Microsoft's products but with the outside world.

Here's how it all began.

The NT team arrives at Microsoft

"We came together as a group in November 1988," Lucovsky told us, noting that the first task for the NT team was to get development machines, which were [then] top-of-the-line 25 MHz 386 PCs with 110 MB hard drives and 13 MB of RAM. "They were ungodly expensive," he said, laughing. The first two weeks of development were fairly uneventful, with the NT team using Microsoft Word to create the original design documentation.

"Originally, we were targeting NT to the Intel i860 (code-named 'N-Ten'), a RISC processor that was horribly behind schedule. Because we didn't have any i860 machines in-house to test on, we used an i860 simulator. That's why we called it NT, because it worked on the 'N-Ten.'"

-Mark Lucovsky
Distinguished Engineer
Windows Server Architect

Finally, it was time to start writing some code. "We checked the first code pieces in around mid-December 1988," Lucovsky said, "and had a very basic system kind of booting on a simulator of the Intel i860 (which was code-named 'N-Ten') by January." In fact, this is where NT actually got its name, Lucovsky revealed, adding that the "new technology" moniker was added after the fact in a rare spurt of product marketing by the original NT team members. "Originally, we were targeting NT to the Intel i860, a RISC processor that was horribly behind schedule. Because we didn't have any i860 machines in-house to test on, we used an i860 simulator. That's why we called it NT, because it worked on the 'N-Ten.'"

The newly named NT team had a basic kernel mode system up and running on the simulator by April 1989. "We started with five guys from DEC and one from the 'outside' (i.e. Microsoft), a guy named Steve Wood," Lucovsky said. "And we stayed a tiny group for a long time, through the summer. We thought, 'How hard could it be to build an OS?' and scheduled 18 months to build NT. But we had forgotten about some of the important stuff--user mode, networking, and so on."

By late 1989, the NT group began growing. They added a formal networking team and expanded the security team beyond the single individual who, until then, had also handled file system and localization development. "We grew that first year to 50 people or so," Lucovsky said. "And within a year, we finally had the first functioning i860 prototypes, so we could use those instead of the simulators. We started looking at context switch times, to get an idea of how well it would perform. It became obvious almost immediately that the i860 would never work out. So we started looking at the MIPS architecture, another RISC design."

In December 1989, the NT team made the decision to ditch the i860 and target the MIPS R3000 chip instead. "Within two or three months, we were booting NT on real hardware in big-endian mode," Lucovsky told us, "and our architecture really paid off. We had designed NT to be portable, and we proved it would work almost immediately when we moved to MIPS. We made the change without a lot of pain."

By this time, the NT team started expanding rapidly, with most of its members now coming from the ranks at Microsoft. The graphics team was greatly expanded once a new style of doing graphics was created. The team also started an NT port to the Intel i386, the mainstream PC processor at the time, but Lucovsky explained why it was important that they didn't target the i386 initially. "We stayed away from the 386 for a while to avoid getting sucked into the architecture," he said. "We didn't want to rely on non-portable assumptions." If they had targeted Intel's volume chip from day one, he said, they would have had a higher-performing system initially, but it would have hurt NT in the long run, and made it harder to pursue new architectures as they did recently with the 64-bit Itanium versions of Windows Server 2003.

NT becomes Windows NT

"By the spring of 1990, we had the MIPS version limping along and we started the 386 version in earnest," Lucovsky said. "It was another huge growth spurt." That May, Microsoft released Windows 3.0 and, suddenly, the world took notice. Windows was a smash success, and the obvious future of PC-based graphical computing. "We started looking at Windows 3.0 and said, 'What if, instead of OS/2, we did a 32-bit version of Windows?'" Lucovsky noted, casually throwing out the question on which the next decade of computing hinged. "Four guys--Steve Wood, Scott Ludwig, a guy from the graphics engine group, and myself--looked at the 16-bit Windows APIs and figured out what it would take to stretch them to 32-bit. We spent a month and a half prepping the API set, and then presented it to a 100-person design preview group to see what they thought."

The key characteristic of the new API, eventually named Win32, was that although it was new, it looked and acted just like the 16-bit Windows API, letting developers easily move to the new system and port their applications. "We made it possible to move 16-bit applications to NT very easily," Lucovsky said, "and these applications could take advantage of the unique features of NT, such as the larger address space. We also added new APIs that weren't in the 16-bit version. We added major new functionality to complete the API, making it a complete OS API, but we did this using a style that would be familiar to the emerging body of Windows programmers."

The reaction within Microsoft was immediate. "They loved it," he said, "when they saw how easy it would be. It was basically Windows on steroids, and not OS/2, which used a completely different programming model." Making NT a 32-bit Windows version instead of an OS/2 product, however, introduced new issues, not all of which were technical. Microsoft had to get ISV and OEM approval, and of course alert IBM to the change. "We did an ISV preview with IBM, and had this deck of about 20 slides, and we said, 'Look, this is what we're going to do.' At first, they thought Win32 was a fancy name for OS/2. Then you could just see it on their faces: 'Wait a second, this isn't OS/2.'"

The decision to drop OS/2 for Windows forever damaged the relationship between the two companies. "But we had executive approval, and started the port," Lucovsky said. "So instead of working on an OS/2 subsystem for NT, we picked up Win32." At that moment, he said, the product became Windows NT.

"Our core architecture is so solid, that we were able to take NT from 386-25's in 1990 to today's embedded devices, 64-way, 64-bit multiprocessor machines, and $1000 scale-out server blades."

-David Thompson
Vice President
Windows Server Product Group

NT's modular architecture paid off during this change as well. "Thanks to our microkernel architecture, with the kernel decoupled from application environments like POSIX and Win32, we didn't have to change the kernel or start a new programming effort," Lucovsky told us. "The deep guts of the scheduler didn't have to change. We had C command line applications up and running within two weeks. This was September 1990."

Thompson elaborated on the importance of NT's foundations. "Our core architecture is so solid, that we were able to take NT from 386-25's in 1990 to today's embedded devices, 64-way, 64-bit multiprocessor machines, and $1000 scale-out server blades. We've been able to deliver a whole array of services on it."

September 1990, truly, was the turning point for Windows NT. Not coincidentally, that's also when Dave Thompson, previously heading Microsoft's LANMAN for OS/2 3.1 advanced development team, joined the NT team. "We threw the switch," Thompson told us, "and the team went from 28 to about 300 people. We had our first real product plan."

RTM and beyond

The first version of Windows NT, Windows NT 3.1, was released in July 1993 and named to match the version number of the then-current 16-bit Windows product. That NT version featured desktop and server editions and distributed security in the form of domains. Since then, the NT team has worked on a progression of releases, all developed on the same underlying code base.

The next release, Windows NT 3.5, was code-named Daytona, and shipped in September 1994. "Daytona was a very rewarding project," Thompson said. "We focused on size and performance issues, and on 'finishing' many of the first-release features of 3.1. Daytona also had significant functional improvements and enhancements." The original themes for Daytona were size, performance, compression, and NetWare compatibility. Two of those goals were emblematic of the time: DoubleSpace-style compression was a hot topic in the early 1990s because disk space was at such a premium, and NetWare was the dominant network operating system of the day. "We eventually dropped compression," Thompson said, "but the NetWare port was strategic. Novell was ambivalent about the NT desktop--they didn't know if they wanted to build a client. We offered our assistance, but they kept messing around and ... well. We did our own. And it just blew them away. Ours was the better NetWare client, and customers used ours for years, even after they finally did one. That client enabled the NT desktop, because NetWare was the prevalent server in the market. We wouldn't have been able to sell NT desktops otherwise."

NT Timeline

October 31, 1988: David Cutler arrives at Microsoft
November 1988: Work begins on NT project
July 27, 1993: Windows NT 3.1 ships
September 21, 1994: Windows NT 3.5 ships
May 30, 1995: Windows NT 3.51 ships
July 31, 1996: Windows NT 4.0 ships
February 17, 2000: Windows 2000 ships
October 25, 2001: Windows XP ships
April 24, 2003: Windows Server 2003 ships

Daytona also benefited from new compiler technology that let Microsoft shrink the code size, making realistic NT desktops possible on lower-end systems than the original version required. "The results were measurable," Thompson said.

Windows NT 3.51 was dubbed the PowerPC release, because it was designed around the PowerPC version of NT, which was originally supposed to ship in version 3.5. But IBM constantly delayed the PowerPC chips, necessitating a separate NT release. "NT 3.51 was a very unrewarding release," Thompson said, contrasting it with Daytona. "After Daytona was completed, we basically sat around for nine months fixing bugs while we waited for IBM to finish the PowerPC hardware. But because of this, NT 3.51 was a solid release, and our customers loved it." NT 3.51 eventually shipped in May 1995.

Fittingly, the next NT release, Windows NT 4.0, became known as the Shell Update Release (SUR), another challenging task that would once again prove the benefits of NT's modular architecture. "We wanted to build a desktop that had the 95 shell but used NT technology," Lucovsky told us. "We eventually moved the Win32 GUI components and hosted them as an in-process driver. Performance was one side effect. We had had problems taking that API and running it in a different process. So moving the code to the same context as the runtime solved a lot of issues. We didn't have to do deadlock detection for GDI and USER. It was significant work, but it solved a lot of headaches." NT 4.0, a watershed release for the product, shipped in July 1996.

Windows everywhere

With the next release, Windows NT would lose the NT name and become, simply, Windows. Thompson says the decision came from the marketing team. "A guy from the Windows [9x] marketing team moved over to NT marketing and said we should use Windows everywhere. We were all uncomfortable with the name change at first, because NT had a solid reputation. But because of the reliability push with Windows 2000, people started talking about how much better Windows 2000 was than 'that old NT stuff,' even though it was the same architecture. So it was actually kind of fortuitous how it happened." Incidentally, Windows 2000 didn't have a codename "because Jim Allchin didn't like codenames," Thompson says.

Since the completion of Windows 2000, the biggest decision the Windows team made was to split the client and server releases with the Whistler products, which became Windows XP and Windows Server 2003. "This lets us focus on the server customers, who want it rock solid, rather than right now," Thompson told us. "Desktop software has to ship in sync with [PC maker] sales cycles. There is no holiday rush with servers."

On to Part Two...

In Part Two of Windows Server 2003: The Road to Gold, we take a look at the Windows development process!