Today marks the 25th anniversary of the original Apple Macintosh.

And while Apple may currently be cooler than a Slush Puppy cocktail in an ice hotel, it hasn't always been that way.

In fact, some of the decisions it has made have been downright disastrous. Here are just seven of them.

1. It used too much proprietary technology

Apple's determination to do things differently has often cost it customers, cash and cachet. Early Macs were stuffed full of proprietary connections, formats and programs. Examples? It was virtually the only company to adopt NuBus expansion card slots (Steve Jobs' NeXT computer was another), while everyone else plumped for PCI. And it ensured file incompatibility with Windows PCs by using Group Code Recording (GCR) for floppy disks, while the Windows world used Modified Frequency Modulation (MFM). Other technologies adopted primarily by Apple include SCSI, ADB and LocalTalk - the Mac equivalents of a PC's parallel, serial and Ethernet ports.

2. It took a big RISC with its processor tech

In the early days of computing, coming up with standards you hoped would become industry practice was commonplace.

When Apple adopted Reduced Instruction Set Computer (RISC) chips supplied by IBM and Motorola for the Mac, a lot was made of their superiority to the Complex Instruction Set Computer (CISC) chips championed by the Wintel camp.

RISC architectures could carry out simple instructions in a single processor cycle, while CISC architectures spread complex instructions across multiple cycles.

In other words, a CISC-based PC needed a lot more raw clock speed to achieve the same result as a RISC-based Mac.

RISC CPUs had numerous other advantages over CISC: they consumed less power and ran cooler, so they were better suited to laptops. The downside, of course, was that RISC chips weren't widely adopted, partly because Intel was big enough and powerful enough to plough on with CISC regardless.

Intel also won the marketing battle between the two architectures: with processor prowess measured in megahertz and gigahertz, Intel's chips were always going to sound more powerful. Which would you buy: a Mac equipped with a 1GHz PowerPC G4 CPU or a PC with a 1.7GHz Intel Pentium 4? The PC, obviously - even though in practice the two benchmarked much the same.
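The megahertz myth comes down to simple arithmetic: work done per second is clock speed multiplied by instructions retired per cycle (IPC), so a slower-clocked chip with a higher IPC can keep pace with a faster-clocked one. Here's a toy sketch in Python - the IPC figures are invented purely for illustration, not benchmark data for real G4 or Pentium 4 silicon:

```python
def mips(clock_mhz, ipc):
    """Millions of instructions per second = clock (MHz) x instructions per cycle."""
    return clock_mhz * ipc

# Hypothetical IPC figures, chosen only to illustrate the arithmetic,
# not measured values for any real PowerPC G4 or Pentium 4 chip.
g4 = mips(clock_mhz=1000, ipc=1.7)  # 1GHz part doing more work per cycle
p4 = mips(clock_mhz=1700, ipc=1.0)  # 1.7GHz part doing less work per cycle

print(g4, p4)  # both work out to roughly 1700 MIPS in this toy comparison
```

Clock speed alone, in other words, says nothing about how much each tick actually accomplishes - which is exactly why "1GHz vs 1.7GHz" was such an effective, and misleading, sales pitch.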

By 2005, it became obvious that Apple simply wasn't big or powerful enough to demand faster, better chips from IBM or Motorola - and none of the three could match Intel's R&D. Result: Apple jumped on the Intel bandwagon and hasn't looked back since.

3. It lost the plot in the 1990s

For five long years between 1990 and 1995, Apple drifted rudderless while Microsoft and Intel carved up the PC market between them. What went wrong? Everything! Apple seemingly forgot it was a company that had to sell products, and frittered away its cash pursuing ideas it hardly ever put into practice.

Apple reached its nadir in 1995, when it had over $1 billion worth of orders for the new Power Macintosh and no way of fulfilling them, plus a chronic over-supply of PowerBook laptops it couldn't find customers for. Apple's problems were so bad that you couldn't mention the company without attaching a 'beleaguered' tag to it. Time summed up Apple's situation best in 1996: "One day Apple was a major technology company with assets to make any self-respecting techno-conglomerate salivate. The next day Apple was a chaotic mess without a strategic vision and certainly no future."

4. It became synonymous with over-priced, under-performing PCs

One of the greatest myths about Macs today is that they cost way more than their PC equivalents, when a direct spec-to-spec comparison often proves otherwise. But the myth persists because that was exactly the situation in the 1990s, when Apple churned out a succession of indifferent computers that cost hundreds, if not thousands, more than their competitors. The Macintosh IIfx was a prime example: it would have set you back between $9,000 and $12,000.

5. It had no answer to Windows 95

In 1988 Apple took Microsoft to court over the 'look and feel' of Windows and its similarity to the Mac operating system. Apple lost - chiefly thanks to an ambiguous contract it had signed with Microsoft some years previously.

The court case set Microsoft free to develop its OS, eventually resulting in the release of Windows 95. Apple saw the threat coming, but simply couldn't answer it with any weapons of its own.

Apple certainly tried - Copland, announced in 1994, promised many features now standard in modern operating systems, including protected memory and pre-emptive multi-tasking. It was abandoned two years later.

Increasingly desperate, Apple contemplated buying BeOS and even licensing Windows NT, before eventually settling on NeXTstep, the Unix-based operating system from NeXT, the company Apple co-founder Steve Jobs set up after leaving. NeXTstep became the basis for Mac OS X.

6. It licensed the Mac OS to third parties

One of the biggest mistakes Apple made during its lost weekend in the 1990s was to license its operating system to third-party PC makers.

The reasoning seemed rational enough - Microsoft got big by licensing its operating system to PC makers, why not Apple?

The problem was that Apple was both a software and a hardware company, so all the Mac clones made by Power Computing and others did was cannibalise demand for Apple's own computers.

Steve Jobs kicked the clones to the kerb when he became interim CEO in 1997.

7. It forgot about its core business

Apple offered a bewildering array of products in the mid-1990s, ranging from over-priced, under-performing Macs (see above) to laser printers, digital cameras and even a touchscreen PDA in the shape of the Newton - most of which lost money.

Steve Jobs canned practically all of them in favour of the new iMac in 1998, cutting the crap and renewing Apple's focus on making great computers.