

How Doom didn't kill the Amiga

A detailed account of the rise and fall of an Amiga zealot

Early 2024


A scene from the Amiga game Defender of the Crown, released in 1986. For years, no other home computer came close to screens like these.

Ever since I saw an Amiga 500 at a friend's house in what was probably late 1988, I wanted one for myself. Back then, computers were uncommon, especially at home. Even though I went to a school in a fairly affluent neighborhood, few kids had home computers or video games.

Gradually, that started to change.

I bought my own Amiga in February 1992. It was basically the exact same model as the one I had first seen in 1988: an Amiga 500+ with a 7 MHz 68000 CPU, 1 meg of RAM, 8-bit stereo PCM sound and a variety of graphics modes, of which 320x256 in 16 or 32 colors was the most common for games.

Competing Platforms

Buying any other computer was completely out of the question. There were practical reasons, of course: I knew how it worked and my friends had Amigas, which meant we could copy pirated games from each other. And it ran Deluxe Paint, an era-defining graphics program and a killer application for anyone with artistic inclinations, which I'd convinced myself I had.

I had also come into contact with other types of machines. The C64 felt like a thing of the past, with blocky graphics and beepy sound. The Mac was a boring monochrome computer made for writing equally boring documents about tax deductions. I vividly remember seeing a PC for the first time, and how disappointed it made me: it was much worse than the Mac, possibly even than the C64. Downright ugly graphics, terrible sound and a mysterious operating system that required you to learn textual incantations by heart.

But the real home computer feud of those days was between the Amiga 500 and the Atari ST, and the latter always seemed to come out losing: worse sound, worse graphics, worse OS. Such a machine was completely out of the question - it just had to be an Amiga.

I don't regret this decision one bit. Few gadgets have given me as much joy and positive experiences as my Amigas, and even in 1992 an Amiga 500 wasn't a bad purchase. The PC was still a fairly boring machine, having a hard time keeping up with the Amiga's sound and graphics without costing an exorbitant amount of money. The revolutionary Amiga architecture, released in 1985 and basically unchanged since then, could still hold its own: a testament to its ingenuity.

The Price is Right

Around this time, PCs were showing signs of becoming affordable, and Amiga-style platform and arcade games had begun appearing on the platform. But even the most straightforward of these titles required a fast 386 CPU, plenty of memory and a VGA card to come close to the amount of motion and commotion the Amiga had been capable of for ages. This meant the price to play was still high: in Sweden, around Christmas 1992, a 25 MHz 386/SX with 2 megs of RAM and a 40 meg hard drive was selling for $530. On top of this came the additional cost of a sound card. A Sound Blaster 2.0, guaranteed to be supported by most games, was available for a whopping $129. While you could hook the Amiga to a cheap 14" TV via RGB SCART, the PC required an expensive VGA monitor, on which you couldn't also watch MacGyver. Bummer!


Shadow of the Beast, released for the Amiga in 1989. It featured lush graphics, multiple layers of parallax scrolling, massive sprites and tough-as-nails gameplay. It looked like crap on pretty much every other system, and wasn't even ported to the then inferior PC.

This put the cost of an entry level 386 system at $660 without software, compared to the Amiga 600's $499, and the A600's 40 MB hard drive came pre-loaded with Deluxe Paint, a word processor and several decent games. It's true that a 386 PC was much faster than the 7 MHz Amiga in some aspects, but most of those aspects didn't matter for people like me and my friends - at least not yet.

More Moore

It's hard to convey just how intense the effects of Moore's law were during the 90's. When it came to hardware, December 1992 was a very different time compared to January 1992. During this year, Commodore pulled a stunt by first introducing the A500+ and then swiftly replacing it with the A600 - a low cost model with the advantage of having an integrated IDE interface - only to finish off by making the A600 obsolete with the release of the next-generation A1200. They were widely criticized for this, but in their defense, home computing was moving at breakneck speed and nobody could really keep up. At any given moment, something that had been unfeasible just six months earlier was suddenly commonplace. One such something was Doom - which was nowhere, and then suddenly everywhere. Some Amiga fanatics still claim that Doom was what killed the Amiga, but I don't believe that to be true. Doom was a symptom, not the disease proper.

Commodore's last sigh, the Amiga 1200, has been touted as a bad machine and an architectural mistake. Its 14 MHz 68020 CPU, 2 megabytes of RAM and AGA chipset are often, in hindsight, dismissed as too little, too late. Even so, it sold fairly well in Europe - because as a home computer, it actually wasn't half bad. It could be equipped with a hard drive, it had an expansion slot for CPU and memory upgrades, and the new graphics modes actually rivalled VGA and even SVGA in many ways. The 1280x512 pixel mode in 18-bit color was too slow for games, but when displaying still pictures it was quite a sight to behold.


One of the few surviving photos of my Amiga 500+. Here I'm showing my grandmother something that looks like a shell window, and she's pretending to be interested. T-shirt sizes were very comfortable in those days.

To me, even in 1994, switching to another architecture still wasn't under consideration. I was now an Amiga fanboy and the platform had become my natural home. And why not? An entry level A1200 hooked to a TV set wasn't the worst of choices for a boy in his early teens and it was of course a natural step up from the A500. My peripherals and nearly all of my software still worked and I could keep using my trusty old 14" TV, on which I could also watch The X-Files without parental interference. It was a platform I was comfortable with, and I got a good second hand deal - it even came with a hard drive, which was what I wanted for my computer most of all at that point.

Still, times were changing. PCs were no longer just for very rich kids and office professionals: any serious gamer had to consider one, and not just because of Doom. PC games were perhaps not yet as colorful, zippy and funky as Amiga games, but they were often very complex and relied heavily on a fast CPU to manage that complexity. Most people didn't really have a fast enough PC to fully enjoy Doom at its time of release in 1993 - but just a year later, expectations of home computing had changed. Even though 486 machines hadn't dropped that drastically in price yet, hardly anyone even marketed 386 machines anymore.

Memory was cheaper, as were hard drives. Better graphics and sound fidelity were expected, and it seemed every new PC graphics card that came out offered higher resolutions and more colors than the previous one. CD quality sound was suddenly a thing, as was putting a CD-ROM reader in your computer. Using a flickering 50 Hz PAL TV instead of a rock solid 60 Hz VGA monitor was no longer seen as clever frugality, but rather as a way of hurting your eyes when trying to play Sim City 2000 in its high resolution 640x480 glory - which the A1200 honestly wasn't quite fast enough to do anyway. Adventure games like Monkey Island II, Simon the Sorcerer and Indiana Jones and the Fate of Atlantis came on eleven (11!), nine (9!) and eleven (11!) floppy disks, respectively. The Amiga 500 had traditionally been a floppy-based system, but a hard drive was now more or less required, even for gamers.


A cutscene from the Amiga game Agony (1992). It took a while, but the PC eventually caught up, graphics-wise.

In spring of 1994, around the time of Commodore's demise, a 33 MHz 486 PC with an SVGA card, 4 megs of RAM, a 200 meg hard drive and a 14" monitor cost $1300. For an Amiga 1200 to even begin to come close to such a system, you'd have to spend at least as much money. And while the PC still came without a sound card, a high resolution monitor for the Amiga had to be able to auto-switch between 15 kHz (PAL) and 31 kHz (VGA), pushing the price of the Amiga system higher still. Add to that the fact that the PC was a big box machine, with plenty of room for, say, a CD-ROM drive - the hot new thing.

The DOS conundrum

By 1995, Commodore was well and truly dead. It was during this time I started getting acquainted with other platforms in earnest: DOS, Windows, Linux, newer Macs, even Unix workstations. I had plenty of friends who owned PCs, but I stayed true to my Amiga, adding CPU and RAM upgrades that cost as much as the computer itself. I could have saved money and bought a PC - but I honestly didn't see the point. Linux was nice for surfing the net, but it lacked all of the fun software I craved: games, demos (as in the demo scene), graphics programs, tracker music. It didn't even have the things I wanted for school, such as a reasonable word processor with Swedish spell checking.

All of this and more was of course available on PCs, and I was frequently exposed to it when visiting my PC owning friends. Deluxe Paint, my beloved killer app for the Amiga, apparently existed on PC as well. And there were lots of other neat programs that appealed to a young demo scener: Fasttracker and Screamtracker for music, QBasic and Turbo Pascal for programming, QPEG for viewing images, TheDraw for making ANSI graphics, and droves of pictures, music, games and scene demos.

The strange thing was that all of the fun PC stuff was made for DOS. I just didn't get it. Command lines no longer put me off, having dabbled quite a bit with them on both Amiga and Linux, but DOS lacked that one crucial thing I had grown ever more accustomed to on my Amiga: Native, effortless, pre-emptive multitasking.

The PCs were indeed impressive, hardware-wise. SVGA was now commonplace, as were 486/DX processors running at 33 or even 66 MHz. Doom was not only playable, but enjoyable. PC scene demos were running impressive texture-mapped 3D effects much faster and, with the addition of modern sound cards, offering much better audio fidelity than my Amiga was capable of. Ostensibly simple things like loading JPEG images felt instant compared to my A1200, even after it was upgraded with a 28 MHz 68030 CPU.


Things like these were perfectly reasonable to MS-DOS users. Expanded or extended memory? Or maybe just conventional? Choose wisely, or you won't be able to fully maximize the incredible potential of running a single program at a time.

What I couldn't grasp was the point of having (and paying for) all that raw power if you couldn't utilize it fully. On my Amiga, I could run Amos Professional - an advanced BASIC dialect - and Deluxe Paint at the same time. If I wanted to change an image used in an Amos program, switching between the two applications was instant. If I suddenly wanted to take some notes, I could just fire up a text editor. And I could listen to music at the same time! In DOS, even something as mundane as enjoying tracker music was a full time, full screen activity - not something you could keep doing while working in other programs.

This multitasking was and is a big part of my affinity for the Amiga, and it opened up the possibility for several other ingenious features. Workflows were completely customizable with powerful scripting languages, the modular design of the OS made it extremely adaptable to things like new file formats and file systems, and every aspect of it could be tweaked and configured to suit personal needs. It was (and still is, in many ways) like running a carefully honed environment from a professional workstation on a cheap home computer. DOS, Atari and Mac just couldn't compare.

Furthermore, DOS seemed to require inordinate amounts of tweaking and configuration. Hardware interrupts had to be configured manually, the 640 kB base memory had to be carefully guarded, and you had to select and configure your sound and graphics cards in almost every single game you wanted to play. On the Amiga, everything just worked. If you bought a new peripheral, all you had to do was plug it in and boot your computer. Drivers were needed for some things, of course, but the Amiga's Autoconfig took care of peripheral detection and configuration.

When dial-up Internet became (almost) affordable for the masses, the Amiga delivered there, too. PC users had to load up Windows 3.11, with its questionable cooperative multitasking, but the Amiga felt as smooth as ever. While waiting for a slow FTP download, I could chat with friends on IRC, play (simple) games and listen to music. A fast 486 with plenty of RAM could perhaps do the same, but not yet with my Amiga's inherent elegance.

Hence, the Amiga still felt like a superior platform. Its swift multitasking, efficient resource usage and many clever ideas made both DOS and Windows feel clunky and primitive. Even though Windows 95 had entered the scene, it was more or less unusable without at least 8 megs of RAM. And all the fun stuff on PCs was still being made for DOS.

Doom me once, shame on you

What about Doom, then? John Carmack himself has allegedly said that he didn't consider the Amiga as being capable of running Doom. As an Amiga zealot, I'd of course like to point out that he was wrong - it's since been ported to the Amiga and runs just fine. But in my heart of hearts, I know that in 1993, he was actually right. The only Amiga available at that time powerful enough to make Doom palatable would have been an Amiga 4000 with a 25 MHz 68040 CPU, costing somewhere around $2500 - without a monitor.

Commodore was frequently derided for not producing yet another killer computer, a proper new Amiga model as revolutionary as the Amiga 1000 had been in 1985. This complaint is somewhat valid, but still, I think, misses its target. It's not that the Commodore engineers didn't have plans - and even prototypes - for much more advanced graphics hardware than the AGA chipset in the A1200. There are just plenty of other reasons why things might not have gone as expected, even if they had reached market.


Ruff'n'Tumble on Amiga (1994): huge bosses, beautiful graphics and frantic action.

VGA was deceptively simple. A framebuffer, more or less, which provided the famous chunky Mode 13h - part of what made Doom possible. No hardware sprites. No blitter for fast memory copying. No copper (co-processor) like the Amiga had, which could change the color of a given palette index every scanline. But you could write once to memory for a single given pixel, and get any color from an indexed palette of 256 values. And you could run spreadsheets in a steady, 60 Hz 640x480.

The Amiga had been designed when sprite-based games were the hottest thing since sliced toast, when memory was still stupendously expensive and when the ability to display 80 column text was considered a noteworthy feature on many home computers. As opposed to VGA, the Amiga had planar graphics, requiring multiple writes to memory to produce a single color value - a perfectly reasonable choice for the time, and one that enabled a lot of other nifty programming tricks and visual effects. The Amiga architecture was so tuned for arcade action that even in 1994, games like Ruff'n'Tumble showed that the 7 MHz Amiga 500 could still hold its own against powerful PCs when it came to fast paced 2D shooters. The problem was that after having dominated the market since first showing up in arcade cabinets, games in that style were becoming unfashionable.
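
To make the difference concrete, here's a small illustrative sketch in C - not real VGA or Amiga driver code, and the buffer layout and function names are my own - of what "one write per pixel" in a chunky framebuffer versus "one bit per plane" in planar graphics boils down to:

    /* Chunky (VGA Mode 13h style): one byte per pixel.
       A single write sets the full palette index for that pixel. */
    void put_pixel_chunky(unsigned char *framebuffer, int x, int y,
                          unsigned char color)
    {
        framebuffer[y * 320 + x] = color;
    }

    /* Planar (Amiga style): the screen is split into bitplanes,
       one bit per pixel per plane. A 32-color screen has 5 planes,
       so setting one pixel means touching five separate memory
       locations - one bit in each plane. */
    void put_pixel_planar(unsigned char *planes[], int num_planes,
                          int x, int y, unsigned char color)
    {
        int offset = y * (320 / 8) + (x / 8);
        unsigned char mask = 0x80 >> (x & 7);
        int p;

        for (p = 0; p < num_planes; p++) {
            if (color & (1 << p))
                planes[p][offset] |= mask;   /* set the bit in this plane */
            else
                planes[p][offset] &= ~mask;  /* clear the bit in this plane */
        }
    }

For blitter-driven 2D games the planar layout was a perfectly fine trade-off; for software-rendered, per-pixel 3D like Doom, it's the single chunky write you really want.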

By now it should be clear that what really drove the PC boom was neither Doom nor chunky graphics: it was cheaper and faster CPUs. Chunky-to-planar conversion on the Amiga does steal a few CPU cycles, but even if the A1200 had been equipped with a chunky mode, its 14 MHz 68020 processor would've been far too slow for Doom. Motorola's 486 equivalent, the 68040, hadn't been subjected to the price drops brought on by mass produced PC clones and competition from rival manufacturers, such as AMD's and Cyrix's 486-compatible offerings. Put simply: Commodore could have crammed an actual VGA card into the A1200, but its CPU would still have been far too slow for Doom. And even if Doom had never materialized, an affordable Motorola CPU still couldn't crunch the numbers needed for increasingly complex simulators and strategy games.
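
As a rough back-of-the-envelope illustration (my own numbers, assuming a naive bit-by-bit conversion of a full 320x256 screen in 256 colors - real demo coders used far cleverer word-merging routines), the cost of shuffling a chunky buffer into bitplanes adds up quickly:

    #include <stdio.h>

    /* Naive chunky-to-planar cost estimate: every pixel has to be sliced
       into bits and scattered across the bitplanes before the display
       hardware can show it. Figures are illustrative, not measured. */
    int main(void)
    {
        long pixels_per_frame  = 320L * 256L;               /* 81,920 pixels */
        long planes            = 8;                         /* 256 colors    */
        long bit_ops_per_frame = pixels_per_frame * planes; /* ~655,000      */
        long frames_per_second = 25;                         /* PAL-ish rate  */

        printf("bit operations per frame:  %ld\n", bit_ops_per_frame);
        printf("bit operations per second: %ld\n",
               bit_ops_per_frame * frames_per_second);       /* ~16.4 million */
        return 0;
    }

Sixteen-odd million operations per second just to move pixels into place is a tall order for a 14 MHz 68020 that also has to run the game itself - which is the point: chunky mode or not, the CPU was the bottleneck.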


Microsoft Flight Simulator 5.0 - tremendously slow even on the recommended minimum 386 CPU.

The later 90's

Sometime after 1995, many fellow Amiga zealots and I started feeling an itch that was hard to scratch. It's true that this itch came, in part, from Doom. Not so much from wanting to play it, but from the desire to show the world that yes, the Amiga was in fact capable of running it. With the correct, expensive upgrades, mind you - but still.

Alas, the itch was also caused by other, more pressing matters. The remnants of Commodore had been bought by Escom and then Gateway, but no new Amiga models had materialized. Despite this, we had dutifully upgraded our CPUs, expanded our RAM and stayed true to our machines - but nothing of substance had come from the new owners of Commodore's IP.

There had been poor attempts at Doom clones on the A1200, but they all somehow lacked the parts of Doom that made it Doom. Surfing the net was now, to be honest, quite painful even on an accelerated Amiga with an expensive dual-sync monitor. The higher horizontal refresh rate (31 kHz) congested the AGA chipset and made 256 color screens unbearably slow. And all those JPEG files took forever to decompress! Some pages didn't look right, either: they were designed for Netscape, a program not available on our platform. In short, we were desperate for the launch of new Amiga hardware.


The pinnacle of Amiga zealot humor in the mid-90's. Hyuk, hyuk, ackshually…

We were used to walkover victories in comparisons between computer brands, but it was now painfully obvious that our beloved Amiga was lagging behind. We didn't know how to respond, and many of us considered remarks or even simple questions about our platform to be personal insults. This made us completely insufferable, and we spent inordinate amounts of time jeering both on- and offline about "Micro$oft", buggy Pentiums, cooperative multitasking and "Plug'n'Pray". Littering IRC with statements like "Windows 95 = Amiga 85" achieved nothing except making us look obnoxious. But teenage frustration combined with sunk cost is a tough mix of emotions to combat: we had invested too much time, money and prestige to give up now. The Amiga would surely rise again! Except, of course, damn you if you even suggested the use of cheap Intel CPUs at the core of the platform.

In 1997, the Doom source code was released and Amiga ports started cropping up. With a very expensive CPU expansion, it was perfectly playable on an A1200 - but nobody cared about Doom anymore. It was all Duke Nukem 3D and Quake now, and MP3 files, and fast Pentium MMX CPUs or even the impressive Pentium II and dedicated 3D graphics cards.

I was now running a 40 MHz 68040 CPU in my A1200, but even that couldn't keep up with the cheapest of PCs. DOS was more or less gone by now, and the fun stuff had started, little by little, to move into Windows.

In 1998, I finally caved and bought a powerful Pentium II PC. I was running a dual-boot system with both Windows 95 and Linux, but eventually Windows NT 4 won out. I could surf the web as well as anyone, play Quake III (after a graphics card upgrade), program web pages using Microsoft's Personal Web Server, ASP and Access databases, chat with my friends on IRC and even watch DivX movies.

But there was still something missing. The fun just wasn't as fun on Linux and Windows. I guess a lot of people like me felt the same: even in 1999, the best demo scene stuff for PC was still clinging to DOS. Personally, I missed all the clever stuff my Amiga and its OS did, and the great and familiar software it ran.

So I just didn't stop using my Amiga. In the early 2000's, I upgraded to an impressively fast 50 MHz 68060 CPU, the last in Motorola's 68k series. I even bought an expensive, towerized A1200 with a 24-bit ("SVGA") graphics card and an ethernet card. It was ridiculously expensive and underpowered compared to any off the shelf PC available at the time - and yet, such a beefy Amiga was a paradox. It was a boyhood dream five or ten years too late - and also less of an Amiga than I perhaps cared to admit to myself at the time. The really fun stuff like Deluxe Paint, Amos and scene demos didn't run on the graphics card. It was more of a platform for running AmigaOS than an Amiga proper, and I eventually ended up downgrading to a more modest configuration. While it couldn't keep up as a daily driver, it was still a computer I booted up regularly, just to have fun.

It still is, and I still do.


Network configuration on a souped up Amiga 1200, circa 2003.

Custom Silicon

Doom didn't kill the Amiga - it was more like one of the many nails in the coffin of custom hardware home computer platforms. The biggest culprit was economies of scale.

Popular 8-bit home computers had, over time and quite naturally, been replaced by 16- and 32-bit machines. Many manufacturers of unique 8-bit machines during the Cambrian explosion of home computers simply went defunct or started making PC clones. By 1995, the only remaining, popular 32-bit home computer system was the PC. Apple regularly tried to muscle in on the home market, especially during the mid-90's with their PowerPC machines, but without much success. During this time period, a Mac was more of a niche office machine than the PC had ever been, and Apple survived mostly by selling systems for magazine and print ad production. They were in fact dangerously close to folding when Steve Jobs stepped back in and managed to secure funding from Microsoft (as "anti-trust insurance") and launch the iMac just in time to ride the wave of the Internet boom.

Commodore had been able to sell their successful 8-bit machines and early Amiga models cheaply thanks to vertical integration. This basically meant that they owned the chip fab, MOS Technology, that manufactured the Amiga's custom chips (and even the C64's 6502 CPU). At the time of its release, this made the original Amiga almost bizarrely cheap compared to equivalent Mac and PC offerings with similar performance.


A Swedish Amiga 4000T advertisement from 1997. For the listed SEK 29990 (roughly $3000 at today's exchange rate), you could get a fully specced out Pentium II machine - with a decent monitor - running circles around the Amiga. If you forked out this much for an A4000 at this point in time, I dare say you were stupid. On the inside of your brain.

But in 1994, when Commodore went bust, the PC clone market had been in full swing for over a decade. Cobbling together a 486 system using mass produced parts proved, in the end, to be far more competitive than designing, prototyping and manufacturing your own complex graphics and sound hardware for a single platform. Besides being cheap, this kind of standardization had other advantages. One was that PCs worked the same all over the world. Since traditional home computers were meant to work with television sets, they had to be timed to either the PAL or NTSC signal standard, which also meant that games (and other software) were timed to this as well. Games made for the European market didn't work on US Amigas without being rewritten, and vice versa. PCs didn't have this problem, providing a global market without added development cost.

Commodore's never-completed AAA and Hombre architectures sounded impressive on paper, but they were still unfinished when the A1200 and A4000 were released in 1992. Besides, it's easy to list specs for nonexistent hardware in hindsight. Even if AAA had become what was promised, it wouldn't necessarily have been competitively priced. Even with the AGA machines, several cost cutting measures had been taken and they still struggled when it came to pricing. Would an even more advanced architecture somehow, magically, have sold as cheap or even cheaper? I have my doubts.

It's quite possible that the Commodore engineers could've worked a chunky mode into the AGA chipset. But, as discussed above, the A1200 would have needed a much faster CPU if Doom - or any other "killer app" game - was ever going to be a possibility. A 68040 would have been far too expensive, but even a 40 or 50 MHz 68030 CPU would've put the machine at a decidedly different price point. Combined with a hypothetical new graphics architecture, we can only speculate about the cost. The Amiga was known for being good and cheap, and combining the two was proving hard - and not just for Commodore.

Falcon Heavy

Many of Commodore's mistakes are said to have occurred after its founder, Jack Tramiel, left the company - even though the Amiga, a great success by all accounts, was bought by Commodore after his departure. Perhaps Tramiel's hardline approach to business ("Business is war!") and cost cutting ("Computers for the masses, not the classes!") could have led Commodore and (had they still bought it) the Amiga down a different path, but history tells us otherwise.

Tramiel left Commodore for Atari Corporation, a company that abandoned the home computer market in 1993 after their last-ditch effort, the Falcon 030. Like the A1200, the Falcon wasn't really a bad computer - but it cost even more than a souped-up Amiga 1200, and was still, in many aspects, underpowered compared to 486 PCs. It didn't come with the impressive specs listed for Commodore's AAA chipset, but its capabilities were far more advanced than those of the A1200. Its graphics chip, ViDEL, could produce a wide array of impressive resolutions, including a chunky 16-bit truecolor graphics mode. It had 16-bit sound, a faster CPU (a 16 MHz 68030) and a 32 MHz Motorola 56001 digital signal processor. This, together with IDE, SCSI and networking hardware, made for a very capable machine indeed.


The Atari Falcon 030 still had the classic "home computer in a keyboard" form factor.

When considering all of this, the Falcon was actually competitively priced - but that didn't matter. People who needed SCSI or networking in 1993 usually had other budgets and priorities than home users - priorities like running WordPerfect or Lotus 1-2-3 for DOS. People who wanted great sound and graphics for gaming didn't want to pay extra for stuff they'd never use.

All those bells and whistles, together with the need for keeping the BLiTTER and YM2149F chips for backwards compatibility, meant the Falcon was too curious, inflexible and expensive for something that was supposedly a home computer. In the end, all its killer architecture managed to kill was, sadly, Atari itself.

Early birds and worms

What if Commodore had managed to put out a new, revolutionary Amiga model much earlier than 1992? Perhaps this could have saved the platform, ensuring its longevity? Maybe - but even though the Amiga 1000 launched in 1985, Amigas didn't become popular until the release of the cheaper, more stable and more mature Amiga 500 in 1987 - the same year IBM launched VGA. Could Commodore have released a VGA killer in 1988? As discussed in the beginning of this text, at this point in time the Amiga already was a VGA killer. Consumer-level PCs just weren't fast enough to do something interesting with 256 colors, and VGA cards were much too expensive anyway. To keep a truly competitive cutting edge, the Amiga would have needed not just a new graphics architecture, but probably a CPU upgrade as well - raising the total cost of the machine considerably.

Commodore could surely have produced something very impressive, but again, at what price point - and would it have mattered? Most PCs sold at this time were still turbo XT clones with crappy CGA graphics, and yet PC dominance was already well established in the business sector. This also meant it had started seeping into homes, where parents saw an opportunity to run spreadsheets and let the kids play games and do their homework on the same machine, instead of buying two expensive family computers.

Commodore was often derided for their poor attempts at marketing the Amiga - but I think that's a bit unfair, too. The Amiga was launched at a huge press event where Andy Warhol and Debbie Harry famously appeared to show off the computer's graphics abilities. This was followed by print and TV ads that clearly and vividly showcased the advanced hard- and software. In Europe, Amigas sold like hotcakes and Commodore UK was very successful in bundling hardware with popular software titles. And, due to its graphics and video capabilities, the Amiga was a well known and popular machine in broadcasting circles - in large part thanks to the impressive Video Toaster.

Could the Amiga have muscled in on the office market in a meaningful way? I honestly don't think so - Apple couldn't, Atari couldn't; in fact, nothing but IBM compatibles could. The Amiga 3000 - a high end model often considered some of Commodore's finest work - is said to have piqued the interest of Sun Microsystems, who wanted to license it as a low end UNIX workstation. The deal fell through, much to the chagrin of Amiga fanatics across the globe - another oft-cited example of Commodore's failure as a company. Even so, Sun machines catered to a different niche market than IBM PCs, and while the notion of a mass produced Amiga UNIX workstation sounds cool, it's questionable whether it would somehow have made consumer Amigas cheaper, faster and better - or simply led Commodore onto a path of expensive high end machines competing with the likes of SGI, HP and DEC. At the very least, considering it would've been running UNIX, it probably wouldn't have resulted in more resources allocated for AmigaOS development.

The real question here is, I think, if we would actually have wanted the Amiga to become yet another boring office machine. Everything that was fun and great about it was, to me at least, also what made business execs so suspicious of it.

Last breaths

With the demise of Commodore and Atari, the traditional home computer was basically dead. Acorn stayed in the game until 1998, but their expensive Archimedes line of machines was never very popular in actual homes, surviving by being more or less subsidized by British schools. Their subsequent RiscPC models catered mainly to various niche actors in broadcasting, including the BBC. Their legacy is now carried on by the popular Raspberry Pi computer.

Even highly specialized machines, such as Unix workstations, were living precarious lives by the end of the 90's. SGI, Sun, Digital and IBM focused their efforts more and more on servers and less on desktop machines. The last new Unix workstation models were both launched in 2006, by IBM and Sun. Even in this lucrative segment, competition from cheap PC hardware ultimately proved insurmountable.

With this in mind, a new killer architecture from Commodore may have been impressive, but it might not even have been an Amiga as we know it. It could have been a completely new machine. It could have been (mostly) compatible with the "classic" Amiga by somehow incorporating the old Amiga into the new one - still synced to PAL or NTSC, with all that entails, and of course adding cost to the machine. It could have become some kind of short-lived UNIX workstation or perhaps a games console. Or it could have become a new Voodoo style graphics card for PCs - a platform Commodore also manufactured, and quite profitably at that.

Both the Atari Falcon and Amiga 1200 were already suffering from minor problems with backwards software compatibility, something that probably made a lot of consumers more open to switching to PC. It's of course impossible to say for sure, but certainly not unthinkable, that the AAA architecture would have failed miserably to run the existing, bare metal banging games (and applications) an upgrading user would have expected to work on an "Amiga".

In short: Commodore might have lasted longer as a company - in some form - had they made other decisions. Such decisions, however, may not have been ones guaranteeing the continued existence of a platform recognizable as an Amiga. Apart from the Mac - in many ways thanks to Microsoft promising continued support and providing a $150 million cash injection - no other home or desktop computer platform survived the 1990's PC dominance in any meaningful way. It seems highly unlikely that the Amiga would somehow have propelled itself back into a significant market position thanks to a chunky graphics mode or, considering the fate of the Falcon 030, even a reworked architecture.

All but gone

The Amiga was an amazing platform, so far ahead of its time that it stayed alive for much longer than seems reasonable. It came out towards the end of the Cambrian home computer explosion and remained in production for close to ten years. Saying it wasn't successful because of Commodore's lack of business savvy is doing it a disservice: many, many millions of Amigas were sold and it was, for several years, the dominant home machine in Europe, where it shaped a generation of curious, capable and creative computer users.

It was, in fact, so popular and successful that even long after Commodore's demise, there were drawn-out efforts to modernize the platform. Here we can glean another of the many nails in the Amiga's coffin: the Amiga fanatics themselves. Most were now identifying so deeply with their platform that, say, suggestions of porting the operating system we loved to cheap and plentiful x86 hardware were considered heresy. It had to be PowerPC or nothing, despite the failure of the BeBox - which in many ways was more of a modern Amiga than any of the officially sanctioned attempts.

This eventually resulted in the AmigaOne series of PowerPC machines launched in the early 2000's. Keen supporters of the platform can nowadays purchase a motherboard with a 2 GHz dual core CPU for a mind-boggling $2000 (Yikes!). Add to that the cost for graphics, sound, memory and everything else. Spending that kind of money will get you a computer mostly useful for finding out why you don't want to run outdated ports of Linux software on an operating system without memory protection.


Pay stupid prices, win stupid hardware.

It's not without irony that the once cheap, integrated, cutting edge Amiga platform has now become a ridiculously expensive PC-style kit computer, running an OS kernel that seems almost as primitive today as MS-DOS once did when we Amiga zealots smugly bragged about multitasking.

A humbling journey, to say the least.


source: https://www.datagubbe.se/afb/