Move over Ethereum: New functionality for Bitcoin Cash makes it a Smart Contract Contender

Smart contracts.  It was dubbed Blockchain 2.0.  (Blockchain 1.0 was cash.)  It held all the promise of a new world, a new digital frontier.  It was to usher in an age of broker-less deals, robot escrows, AI oracles, and driverless automobiles acting as their own corporations: self-reliant actors in a new digital economy, one which did not discriminate between true-born humans and automata born of machine code.

That was the dream.  That was the promise.  That was what everyone spoke about for the last four years.  Except that it never happened.  Oh, there were many attempts.  Some achieved a modicum of success, some less so, and others ended in full-blown multi-million-dollar fraud or theft.  (Yes, I’m talking about most of the projects in the Ethereum space, especially, but not exclusively, the DAO.)

Let’s talk about Ethereum for a bit, as it is the blockchain with the most activity in the Blockchain 2.0 space.  Arguably it drew away many of the Bitcoin developers after its launch in 2015 as the blockchain built for smart contracts and other programmable-money uses.  But at least half of its success is due to the fact that, around that same time, Bitcoin suffered some pretty crippling self-imposed limitations that would all but exclude it from contending for the mantle of programmable money.  In fact, Vitalik Buterin, the founder and spiritual leader of the Ethereum movement, was originally a bitcoiner, and he only created Ethereum because the Bitcoin Core developers at the time deliberately went out of their way to disable much of the functionality that would have allowed a smart contract programming language to be built on Bitcoin itself.  So Vitalik did exactly what any good decentralist does when faced with oppression by the established regime: he left and did his own thing.  He went and started designing Ethereum.  This was 2013.

However, because he had to build it from scratch, or perhaps because Vitalik didn’t have the same insights Satoshi did, he approached the design of Ethereum in a pretty naïve fashion.  He wanted a Turing-complete language so that it would be easy for developers to write smart contracts.  But a Turing-complete language would mean that infinite loops would be possible, which would be a bad thing on a globally decentralized blockchain.  So he resolved that with an economic protocol cost applied to each computational step: you pay per operation, and programs run amok eventually run out of ‘gas’ and stop executing.  But this introduced a whole new category of complications: how much should each operation cost relative to the others?  Relative to the total computational capacity of the whole network?  How would this scale as time went on?  He then went on to ‘solve’ this new problem in a way that added even more complexity and, yes, opened up a new class of problems.  He decided that the protocol should simply change the rates every so often, by edict from the outside world: miners would decide what gas prices should be and magically come to consensus on it, heeding the advice of the senior ETH core developers.  It was the ‘central bank’ approach.  Economically speaking, Ethereum was already far more complex than Bitcoin, and writing and testing smart contracts can get costly, as your bugs burn away your ETH as you make mistakes.
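The pay-per-operation idea is simple to sketch: charge a fixed price per opcode and halt the moment the budget runs out.  Here is a toy gas-metered interpreter (the opcode names and prices are made up for illustration; real EVM costs live in a long fee schedule):

```python
# Toy gas-metered stack interpreter: every operation burns gas, and
# execution halts with an "out of gas" error once the budget is spent.
# Opcode names and prices are invented for this illustration.

GAS_COST = {"PUSH": 3, "ADD": 3, "MUL": 5}

class OutOfGas(Exception):
    pass

def run(program, gas_limit):
    stack, gas = [], gas_limit
    for op, *args in program:
        gas -= GAS_COST[op]
        if gas < 0:
            # A runaway (even infinite) program stops here by itself.
            raise OutOfGas(f"halted at {op}")
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack, gas_limit - gas  # final stack and gas consumed

# (2 + 3) * 4 costs 3+3+3+3+5 = 17 gas:
stack, used = run([("PUSH", 2), ("PUSH", 3), ("ADD",),
                   ("PUSH", 4), ("MUL",)], gas_limit=100)
# stack == [20], used == 17
```

The hard part, as described above, is not the mechanism but the numbers in `GAS_COST`: set them by decree and you have a central-bank problem, not a protocol.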

Scaling issues

To further the issues, Ethereum has some serious scaling hurdles.  You may have heard how one wildly successful application on ETH called “CryptoKitties” nearly melted down the entire network several times due to transaction flooding.  How?  It’s a very addictive digital card collecting and trading application, where ‘digital breeders’ can make their own unique kitten mutations and sell them for ETH.  Once many people use the application at the same time, the network floods with transactions and the whole blockchain slows to a crawl.  But why?  Because the designers of Ethereum took another naïve approach to the problem of STATE and STORAGE.  Basically, if you are going to run programs on the blockchain, then the code for those programs and their interim state (the memory a program holds as it moves from instruction to instruction) is all stored on the blockchain nodes themselves.  Which is to say, EVERY ETHEREUM SERVER is storing EVERY PROGRAM’S STATE.  That’s a lot of wasted storage, especially for people who don’t care for digital kitten mutation as a pastime.  What’s worse, every Ethereum server is also doing all the calculations for the CryptoKitties decentralized application, even if you are not using it.  Basically, when Vitalik says Ethereum is a “World Computer”, he means it is a very, very inefficient computer, because every computer in the world is executing the same code and storing the same data as everyone else, at the same time.  Yep.  Talk about the naïve approach.  It is pretty much the MAXIMALLY naïve design for decentralized multiparty computation: _have everyone do every computation_!  No wonder they have such a doozy of a time trying to scale Ethereum past the point where one popular application can wreak havoc on the network.

Well now, why do I bring up all these criticisms of ETH?  I’m not trying to throw cold water on their party.  In fact, I have great respect for Vitalik and the many smart contract developers I have met and know, as they are truly breaking new ground in this space, and it is on the shoulders of their hard work that we will carve out the path to the digital frontier of the future.  However, I do want to bring up Ethereum’s fundamental design flaws because they will soon have a worthy competitor.  No, it’s not another complicated smart-contract blockchain hatched out of the desire to make its founders rich (there are _many_ in this category).  It is, in fact, the sleeping giant, the original: BITCOIN.  But how, you ask?  How is it possible that it can now serve as a solid foundation for smart contracts when it couldn’t before?  Did Vitalik miss something?  No, he didn’t, because the Bitcoin he left is still stuck exactly as he left it back in 2014.  We are, of course, talking about Bitcoin Cash, the offspring of legacy Bitcoin that decided hard forks were an upgrade mechanism and that it would be OK to grow the network and add new features, or re-enable old ones.

It is exactly the latter that will usher in the new age of smart contract development.  On May 15th, 2018, BCH will hard fork as part of its scheduled six-month upgrade cycle, and one of the most exciting changes in the upgrade is the re-enabling of some of the old OP_CODES, which were disabled by Core developers out of fear that they might be insecure or open up attack vectors on the network, back when the codebase was immature and the network very small.  For the computer scientists reading this, the interesting instructions are OP_CAT and OP_XOR (concatenate, and bitwise XOR).  I won’t go into why these are so important, but if you are interested you can read about how Bitcoin is effectively a Turing machine.  This means that arbitrary calculations can be done on Bitcoin, using a method that separates the DATA and CODE from the proof of execution.  For the technically inclined, the analogy would be that Bitcoin blockchain transactions effectively become a micro-instruction table, a set of CPU registers, and a program stack pointer.  All the data, the code, and the storage live elsewhere.  This makes the Bitcoin model much simpler than the Ethereum model (store and compute everything on the blockchain nodes).  It’s such an elegant solution that one wonders if it was always meant to be this way, designed by the original Satoshi, but somewhere along the way it just got derailed.  And why not?  Everything else about the Bitcoin design is fairly simple and straightforward.  Coming up with it required several leaps of intuition, but once you read it the solution is surprisingly obvious.  (One could reflect on the similarity of this “difficult to come up with, but simple to verify” property to the signature paradigm of the whole Proof-of-Work and hashing model itself.  Indeed, it seems Bitcoin is self-referential, or at least self-consistent.)  Recall the original whitepaper was only 9 pages long.
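To make the two re-enabled opcodes concrete, here is a minimal Python model of Bitcoin SCRIPT’s stack semantics.  This is only an illustrative sketch, not the actual consensus code; the real rules add element size limits and many more opcodes:

```python
# A toy model of Bitcoin SCRIPT's stack, showing the two re-enabled
# opcodes discussed above.  SCRIPT operates on byte strings pushed
# onto a stack; each opcode pops its operands and pushes its result.

def op_cat(stack):
    # OP_CAT: pop the top two byte strings, push their concatenation.
    b, a = stack.pop(), stack.pop()
    stack.append(a + b)

def op_xor(stack):
    # OP_XOR: pop two byte strings of equal length, push their
    # bytewise XOR.
    b, a = stack.pop(), stack.pop()
    assert len(a) == len(b), "OP_XOR operands must be the same length"
    stack.append(bytes(x ^ y for x, y in zip(a, b)))

stack = [b"pay", b"load"]
op_cat(stack)                       # stack is now [b"payload"]

stack = [b"\x0f\x0f", b"\xff\x00"]
op_xor(stack)                       # stack is now [b"\xf0\x0f"]
```

Small as they look, being able to glue byte strings together (OP_CAT) and combine them bitwise (OP_XOR) is what lets scripts manipulate arbitrary data rather than just check signatures.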

So where does this leave Ethereum post-May 2018?  It is anyone’s guess.  Ethereum still has several years of head start on Bitcoin Cash.  It has several custom languages that developers can use for writing smart contracts.  Bitcoin still has only its original SCRIPT, a language akin to programming on an HP calculator (it is similar to FORTH).  But now that the missing OP_CODES are being brought back, higher-level languages can be built that compile down to low-level Bitcoin SCRIPT.  I foresee a rich ecosystem of smart contracts and languages for developers being built on top of Bitcoin in the years to come.  And of course, when I say ‘Bitcoin’ I mean Bitcoin Cash, the only Bitcoin that can be upgraded on-chain.

 

/EOL

In my next article I will talk about Ripple and its potential future, given its current strategy as an interbank currency payment system.

 

 

The WEB just ate my computer!

Remember, those of you who are old enough, that once upon a time computers were not connected to the internet by default.  When things like Token Ring LANs and Novell NetWare were tools only companies could afford, and email didn’t exist, and when you wanted to write somebody a memo, you fired up WordPerfect, wrote it up, printed it out on your dot matrix printer, tore off the perforated edges, and handed it to your secretary.

Remember back before Steve Jobs (God bless his soul), when computers were not connected by default?  Those were the days when applications were written for the computer they ran on, and software portability was an arcane and complex art.

Remember how, back then, IBM had the foresight to realize that portability was a problem that needed to be addressed, and so had big dreams about Java (Sun’s creation, which IBM championed heavily) being the interface layer that would make the dream of “write once, run anywhere” come true?

They even developed ‘net terminals’ that ran only JVMs, so that they could run any Java app.  Many hardware manufacturers jumped on this thin-client bandwagon, as they saw a chance to sell a new hardware platform that could compete with the dominance of Intel, but they were held back by the limitations of the JVM, the lack of applications, and network bandwidth.  The idea was to put all the applications on servers and download them over the LAN to your Net Stations to run.  Apps and data alike were to be stored back on the company’s servers.  Clients were dead.  Servers were to run everything.

Good ol’ Big Blue with another technology too early for its time

 

Whatever happened to that?

 

Simply put, IBM, the research firm it had become, had pulled another whimsical technological gizmo out of its hat, way too early for its time.  The world had hardly had time to get used to the advent of the World Wide Web, and there was IBM already trying to remove local storage from the computer.  It was bound to fail.  This was at a time when the world had not yet become accustomed to software subscription models, nor to SaaS- and IaaS-based cloud computing.  The world was still built around monolithic native applications and the segmentation of software into hardware and operating system camps; open source software was still relatively new, and the best software was still proprietary.

If we look back now, the dream of write once, run anywhere has been realized.  Not by Java or IBM, but by HTML and JavaScript.  The open internet came by and ate their lunch.  HTML5, JS, PHP, Ruby, Python, and front-end frameworks filled the gap and made GUIs simple to write.  JavaScript is now vastly more popular than Java.  Why?  Was it a failure of object-oriented programming?  Of compiled languages lacking good free GUI toolkits?  Was it the lack of supporting services such as cloud storage and cloud computing that made storing data remotely so unwieldy?  Whatever the reason, it was a brief glimpse of the potential that would start the advent of cloud-based services.  The big difference is that they would not be controlled by established technology companies like IBM or Novell or Oracle, but by internet companies.  Nowadays, machines need local storage less and less, with services like Google Drive, Dropbox, and Microsoft OneDrive.  The conversion to ‘thin client’ Net Stations is complete, but in a decentralized way, thanks to tech like Linux and FOSS, and companies like Amazon, Dropbox, Google, Mozilla and Microsoft.

Call me old fashioned, but I’m still a bit reticent about having a local computer without any local storage; some things you just need to have locally, such as secure applications or data that you want to stay encrypted under your exclusive control.  But more and more, I’m finding that the data produced in the course of normal daily work, office documents, PDFs, contracts, code, memos, notes, emails and the like, need not be local.  These seem to be best stored in the cloud, so that multiple computers at home and abroad, and in a pinch my mobile or tablet, can access them.  More and more I’m finding that my music collection is also in this category.  Even family pictures are now stored in the cloud.  How much of our data do we actually control and own?

Did you also notice how much more time you spend in your browser versus standalone applications in recent years?  Even Office apps are usable on the web, with features that match those of their standalone counterparts.  I think we can safely say the age of buying software in a box is over, and everything now is totally connected to the internet.  Whether we like it or not, all our data is belong to the internet.

This means data privacy is going to be more and more of a heated topic in the years to come.

The internet is now the TV/Radio/Video Collection/Photo Album/Bookshelf for the generations to come.  I welcome our new robot masters.

/EOL