Let’s say, for the sake of illustrating a point, that BT has 4M business customers in the UK. If every one of those customers maintains a record of BT’s address details – whether in an accounting system like Xero or just in a simple contacts database – then there are, theoretically, at least 4M separate instances of BT’s supplier account details in use today.
Which is to say there are 3,999,999 duplicate records – in the old, unconnected world of business, an unavoidable consequence of necessity. Using some dubious expert guesswork, I roughly calculate that one instance of BT’s details would use around 4k of disk space, and therefore, by theoretical extension, all those duplicate BT supplier account records would soak up a combined 16 gigabytes of disk space. But that’s just one big company with lots of customers, and clearly not every single one of the UK’s 4.7 million companies trades with each other.
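If you want to check my guesswork, the sums are only a few lines of Python (the 4M customer count and the 4k-per-record size are the assumptions from above, not real BT figures):

```python
# Back-of-envelope: duplicate copies of one supplier's record.
RECORD_SIZE_KB = 4            # assumed size of one supplier record
BT_CUSTOMERS = 4_000_000      # illustrative count of BT business customers

duplicate_copies = BT_CUSTOMERS - 1                      # everyone but one "master" copy
total_gb = BT_CUSTOMERS * RECORD_SIZE_KB / 1_000_000     # KB -> GB (decimal units)
print(f"{duplicate_copies:,} duplicates, ~{total_gb:.0f} GB in total")
# prints: 3,999,999 duplicates, ~16 GB in total
```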
So, let’s ignore the big old edge cases and instead guess that the average small business – for they make up 98% of the 4.7M – maintains 500 other companies’ customer records in its accounting system, not including prospect lists or other databases like services, warranties, memberships etc. Using my same dubious guesswork of 4k per record, that throws out a customer record database of about 2 megabytes per company. And if all those 4.7M theoretical 2-megabyte databases were dumped onto a single hard disk, that disk would need to be a not inconsiderable 9,400 gigabytes in size – single hard disks run to about 2,000 gigabytes today.
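The same sort of sketch covers the average-small-business guess (500 records per company and 4k per record are, again, my assumptions):

```python
# Back-of-envelope: customer databases across all UK companies.
RECORD_SIZE_KB = 4            # assumed size of one customer record
RECORDS_PER_COMPANY = 500     # guessed average customer records per small business
COMPANIES = 4_700_000         # UK registered companies

per_company_mb = RECORDS_PER_COMPANY * RECORD_SIZE_KB / 1_000   # KB -> MB
combined_gb = COMPANIES * per_company_mb / 1_000                # MB -> GB
print(f"~{per_company_mb:.0f} MB each, ~{combined_gb:,.0f} GB combined")
# prints: ~2 MB each, ~9,400 GB combined
```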
If all UK company data were stored only once in a centralised cloud database, and every system of record stored a simple data pointer to each centralised record, the collective cloud data-file would be a measly 19 gigabytes.
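The store-it-once sum is the simplest of the lot (same assumed 4k record size; the pointers themselves are small enough to wave away for this exercise):

```python
# Centralised model: one copy of each company's record, everyone else
# holds only a pointer to it.
RECORD_SIZE_KB = 4            # assumed size of one company record
COMPANIES = 4_700_000         # UK registered companies

central_gb = COMPANIES * RECORD_SIZE_KB / 1_000_000   # KB -> GB
print(f"~{central_gb:.1f} GB for one copy of everything")
# prints: ~18.8 GB for one copy of everything
```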
You can get USB memory sticks larger than that today for twenty quid.
Finally, to get the absolute worst-case theoretical filesize, where every UK company traded with every other UK company, we’d multiply 4.7M companies by 4k per company record to throw out a single company’s data filesize of nearly 19 gigabytes, which we’d then multiply by 4.7M to arrive at a whopping 88 million gigabytes – 88 petabytes – of collective disk space.
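The worst case is just the previous sum squared, so to speak – every one of the 4.7M companies holding a 4k record for every other one (same assumptions as before):

```python
# Worst case: every UK company stores a record for every UK company.
RECORD_SIZE_KB = 4            # assumed size of one company record
COMPANIES = 4_700_000         # UK registered companies

per_company_gb = COMPANIES * RECORD_SIZE_KB / 1_000_000   # ~18.8 GB per company
worst_case_gb = per_company_gb * COMPANIES                # all companies combined
print(f"~{worst_case_gb / 1_000_000:.0f} million GB collectively")
# prints: ~88 million GB collectively
```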
So, a single instance of a universal cloud-based database would take up around 19 gigabytes, the theoretical worst-case offline figure is 88 million gigabytes, and the true figure is goodness knows where in between. And that’s before factoring in other records and transactions.
But before this blog post gets totally out of control, my simple observation is this: as we shift ever more into an online digital world – whether it be systems of record in business, or 10 million personal music libraries each holding an exact duplicate copy of the same MP3 of a Lady Gaga track – you have to wonder whether we will ever kick this thus-far inescapable appetite for epic levels of database redundancy that our legacy IT systems and old-world business processes impose.
I hope we do.