Sometimes it takes a while before technology catches up with ideas: for example, Microsoft was pushing tablet computers and smartwatches nearly a decade before the iPad and Apple Watch respectively, but the tech wasn’t quite up to the job.
Other bright ideas fell by the wayside for more prosaic reasons, whether that was industry lobbying or just good old-fashioned incompetence.
In fact, some of the best ideas we have in technology didn’t become successful products in their first incarnations – sometimes it takes a few failed attempts before the successful formula for a world-beating idea emerges.
You’d be surprised how many of the world’s noted tech firms are in this list – just because they’re good at what they do now doesn’t mean they always have been…
Imagine being able to download documents and apps over phone lines, and have those documents and apps link to one another.
You’d be able to create programs without coding, trigger multimedia events and analyse data with ease. What would you call this 'web' of connections, available 'world wide'? That’s right – you'd call it HyperCard.
HyperCard had an influence on the world wide web, but it existed long before; rather brilliantly, Apple employee Bill Atkinson began development of it after a particularly memorable LSD trip.
He began work in 1985 and the first release was in 1987, free with Apple Macs. It was hugely popular, but it didn’t seem to have a defined target user – and it didn’t help that Apple might have seen HyperCard as a possible competitor to its own software.
Apple ultimately hived off its software business as a standalone company, Claris, where development of HyperCard stalled and it never regained its initial momentum.
Immortalised in the film Who Killed the Electric Car?, General Motors' EV1 was based on a 1990 concept for an electric vehicle. It went on sale – or rather, on lease – in 1996, but manufacturing was discontinued just three years later.
One of the reasons was that the EV1’s reason for being had changed. It had been developed to meet strict new regulations demanding car firms offer zero-emission vehicles in California, but car makers’ lawyers neutered those regulations.
Meanwhile, General Motors felt that electric cars were an enormous money pit. Unprofitable and at the time unnecessary, the whole programme was canned in 2002, and the vehicles repossessed – most were crushed.
Before Facebook there was Friendster, but before that there was LiveJournal. Started by US programmer Brad Fitzpatrick in 1999, it was originally designed for high-school friends to stay in touch, and combined blogging and commenting to great effect.
It grew like wildfire, and was so popular that new sign-ups had to be restricted because the servers couldn’t cope.
But after its sale to Six Apart in 2005, LiveJournal introduced something it said it never would: ads. And then LiveJournal essentially started doing what some accuse Twitter of doing now: messing with key features and designs without consulting users, ignoring user complaints, then suddenly backtracking.
For many the final straw came in 2007, when a purge of dodgy content wrongly identified and deleted hundreds of users’ accounts – not once, but twice. Today, LiveJournal is usually spoken of in the past tense.
Apple’s second appearance here is the Newton, AKA the MessagePad; long before the iPad was a glint in Steve Jobs’ eye, the Newton offered a portable computing experience that some believed would harm Mac sales.
It was a superb product, but it was expensive – and its headline feature, handwriting recognition, wasn’t as reliable as the ads suggested.
Sales were poor, and when Steve Jobs returned to Apple in 1997 the Newton was one of the projects he killed.
But while the product was canned, the underlying philosophy wasn’t: 10 years later, Jobs would unveil the iPhone, a portable computing experience that would soon dwarf Mac sales.
Microsoft doesn’t always get the credit it’s due for innovation: its Tablet PC might have been clunky as hell, but it was on sale for the best part of a decade before the iPad.
And that wasn’t the only product Microsoft made too early – its SPOT watch predated the Apple Watch by 11 years, although unlike Apple’s timepiece, it was canned after just two years.
The problem? A combination of price, lack of performance and a questionable decision to build a proprietary FM radio network instead of using existing mobile networks to communicate.
If you ventured beyond the city, your SPOT watch’s coverage became awfully spotty.
One of the reasons Microsoft doesn’t always get the praise it deserves may be because of the way it over-hypes some things.
The Courier is a case in point: this fantastic, exciting, must-have tablet was never a shipping product, and while its existence was first reported in 2009, the project was shuttered in 2010.
This was a real shame – a device that used a digital pen in a form factor that resembled a true notebook was an idea that caught the imagination of many (TechRadar journalists included), but it never saw the light of day.
Sources say the Courier fell foul of internal politics – something that had affected the Tablet PC too – because its modified, email-free version of Windows didn’t fit with the company’s strategy for Office, its cash cow.
Microsoft instead opted for Windows 8 for the tablet world, and is still soldiering on with Windows 10 in its Surface range.
PSVR, Oculus and Vive are standing on the shoulders of Mario – Nintendo was offering VR-esque controllers back in 1989, and 3D headsets in 1995.
The 1989 Power Glove for the NES wasn’t a huge success, though, because only two games were made for it, and sales struggled to reach 100,000 units in the US (although that actually seems pretty impressive given the limited titles available).
However, the motion-sensors within it would go on to inform the Wii controllers and other handheld motion trackers. As for the Virtual Boy, it was a commercial failure as well: too expensive, too limited, too unimpressive, and too few decent games – this was almost the ultimate 'too soon' product.
Who knows what would have happened if Nintendo had waited, and made a real go of VR in the current climate?
Apple often gets all the plaudits for 'reinventing the smartphone', but the Palm Pre was a lovely thing, and a quality rival: its WebOS operating system was gorgeous and slick, it had both touchscreen and keyboard input, and it was loved by reviewers.
In many cases it proved the equal of the iPhone 3GS, but it fell down in three key areas: build quality, operating system and apps.
Too many Pres suffered from basic hardware failures – cracking screens, faulty headphone sockets, buttons that stopped working – while updates to WebOS were few and far between.
More than anything, though, the Pre was up against Apple’s new-found enthusiasm for third-party apps, the number of which burst through the 100,000 mark in late 2009, while Palm couldn’t muster anything close to that fervour among developers.
Maybe if it had kept its original name – the thoroughly excellent MCA DiscoVision – then LaserDisc might have been a bigger deal.
The first North American commercial release, Jaws, was back in 1978, and LaserDisc was also used to power cinematic arcade games such as Dragon’s Lair.
However, while the technology was impressive – and offered far better picture quality than videotape – the prices weren’t; typical releases cost around $39.99 (£30 / AU$50), with special editions costing more than twice that.
Factor in the cost of a player – $300 (£225 / AU$375) for a cheapie, $1,500-plus (£1100 / AU$1850) for a high-end one – and LaserDisc was never going to be a mass-market sensation. At its mid-90s peak, LaserDisc was in just 2% of US homes.
Sony’s robot dogs were 'entertainment robots', and were sold from 1999 until 2006, when the Aibo was admitted to the Carnegie Mellon University Robot Hall of Fame as 'the most sophisticated product ever offered in the consumer robot marketplace'.
Sony saw the future, but unfortunately it wasn’t profitable enough for a corporation undergoing severe financial problems; with just 150,000 Aibos sold, Sony decided to focus on more lucrative products instead.
The company ended support for the robotic mutts in 2014 when it closed its last Aibo repair centre, but there’s a thriving market in Japan for parts and repairs, because Aibo owners really, really love their electronic pets. The New York Times has a wonderful, heartbreaking video featuring Aibos and their adoring owners.
Rebranded MSN TV when WebTV Networks was acquired by Microsoft, WebTV was a set-top box that turned a TV into a web browser for viewing websites and accessing email.
It wasn’t, and didn’t try to be, a complete alternative to desktop PCs; it was designed specifically to deliver straightforward and cheap internet access for people without PCs.
It launched in 1996 as one of the very first consumer devices for accessing the world wide web, and while initial take-up was slow it had 800,000 subscribers by early 1999. It also managed to deliver some cutting-edge features combining TV viewing and the internet, as well as scheduling video recording in a similar way to modern DVRs.
The service ran until late 2013, by which time it had long been eclipsed by web-enabled consoles, TVs and other devices as Microsoft began absorbing the tech into other parts of its business.
The 1981 Xerox 8010 Information System, also known as the Xerox Star workstation, was revolutionary: it introduced the concept of a desktop, brought the graphical user interface to market, used a two-button mouse, delivered What You See Is What You Get document editing (that we all use today to, you know, write stuff) and included Ethernet networking.
However, it was beyond pricey: the introductory cost was $16,500 (around £12,500 / AU$20,000), which works out at around $43,000 (around £32,500 / AU$50,000) in today’s prices. And while all computers were expensive back then, the Xerox was still considerably more expensive than others – a Commodore VIC-20 was around $300 (around £225 / AU$375).
Other firms, most notably Apple and Microsoft, would go on to emulate the Star’s interface with considerably more success, creating the operating systems we know today.