The Many Faces of Scalability

Our biggest achievement of the past week was completing a working draft of the White Paper, which we have started circulating internally. This has already been helpful in clarifying some key concepts of the technical design and the requirements for the economic model.

That said, we are confident about the economic model and its justification. The basic logic remains that Healid will generate dramatic value for the world: a platform that lowers health care costs while making people healthier at the same time. The Heal-ID itself will give individuals unique control over their data across borders of any kind (technical, political, legal). This opens the door to expanding its use to other functions.

Where Bitcoin miners store massive amounts of data (the entire blockchain) and spend gigawatts of energy on enormous calculations to find otherwise meaningless nonces, we provide tangible value through data storage and computing power. And on top of that, we have a revenue model.

So far, no previously unknown challenges to the technical approach have surfaced. Some questions that come up repeatedly were:

  • Why use a decentralized model when the centralized models used by some of the largest entities in the world still seem superior?
    Short answer: Because we see decentralized models very quickly becoming superior, driven by decentralized incentives, as demonstrated by the ultimate decentralized success story: the Internet. 
  • Why not use a relational database model, with its proven maintenance capabilities and overall flexibility?
    Short answer: Because centralization introduces data-control and scalability risks, among others; because our data is of a different kind, limited and specific (genome data), and does not require all the manageability of traditional systems; and because we need the scalability and resilience of distributed solutions.   
  • Will the solution be scalable enough, considering that the likes of Storj are still in their infancy and Filecoin does not even have a finished product yet?
    Short answer: Because we are encouraged by the rapid rise of such solutions (see the scalability of IPFS), and we liken it to the rapid increase in transactions-per-second (TPS) scalability from the original Bitcoin to half a dozen newer platforms (IOTA is just one example), which are about to surpass traditional VISA-level performance.  
  • How will we be able to keep the storage size requirements manageable?
    Short answer: Because during technical mining we can leverage algorithms that compare data to a reference genome, exploit repetitive data sequences, and structure the data in a way that optimizes storage space, for starters. 
  • How can we minimize data transfer rates during big-data, AI-driven analyses?
    Short answer: Because we see options to use the technical mining process to optimize how we store and access data, such as building index or reference structures and computing statistical aggregations and correlations during mining.  
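
To make the storage point above concrete, here is a minimal sketch of reference-based delta encoding, the general technique behind comparing sample data to a reference genome. All names are illustrative assumptions, not part of our actual implementation:

```python
# Hypothetical sketch: reference-based delta encoding of genome data.
# Instead of storing each full sequence, store only the positions where
# a sample differs from a shared reference genome.

def delta_encode(reference: str, sample: str) -> list[tuple[int, str]]:
    """Return (position, base) pairs where the sample differs from the reference."""
    return [(i, s) for i, (r, s) in enumerate(zip(reference, sample)) if r != s]

def delta_decode(reference: str, deltas: list[tuple[int, str]]) -> str:
    """Reconstruct the sample from the reference plus its stored deltas."""
    seq = list(reference)
    for pos, base in deltas:
        seq[pos] = base
    return "".join(seq)

reference = "ACGTACGTACGT"
sample    = "ACGTACCTACGA"
deltas = delta_encode(reference, sample)
# Only the differing positions need to be stored; the sample is fully
# recoverable from the reference and the (much smaller) delta list.
assert delta_decode(reference, deltas) == sample
```

Since any two human genomes are overwhelmingly identical, storing deltas against a reference rather than full sequences is what keeps storage requirements manageable.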

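The index-building idea from the last bullet can be sketched the same way: compute index and summary structures in the same pass that processes the raw data, so later analyses touch only the small derived structures. This is a hedged illustration with assumed names, not our actual mining code:

```python
# Hypothetical sketch: build a variant index and aggregate statistics in
# one pass over the samples ("technical mining"), so AI-driven analyses
# can query the compact index instead of transferring full datasets.
from collections import Counter

def mine(reference: str, samples: dict[str, str]):
    """Single pass over all samples: variant index plus allele counts."""
    variant_index: dict[int, set[str]] = {}   # position -> sample ids that differ
    allele_counts: dict[int, Counter] = {}    # position -> observed bases
    for sample_id, seq in samples.items():
        for pos, (r, s) in enumerate(zip(reference, seq)):
            if r != s:
                variant_index.setdefault(pos, set()).add(sample_id)
                allele_counts.setdefault(pos, Counter())[s] += 1
    return variant_index, allele_counts

samples = {"s1": "ACCT", "s2": "ACGT", "s3": "ACCT"}
index, counts = mine("ACGT", samples)
# index[2] == {"s1", "s3"}; counts[2] == Counter({"C": 2})
```

An analysis that only needs, say, variant frequencies then transfers a few kilobytes of aggregates rather than the underlying sequences.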
It was particularly encouraging to see IOTA break the $1 per token mark, acknowledging the prominent role this currency is about to play in our space. The market seems to confirm the unique progress made by that platform. 

With more analysis we now see a need for a simplified technology approach. While developing our own IPFS-based file storage solution is not out of the picture, we are much more scalable if we remain “file storage agnostic.” What should count is how these files are stored, their data format and structure, and how effectively we can access them. 
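One way to read “file storage agnostic”: application code talks only to a minimal storage interface, and backends (local disk, IPFS, a cloud store) become interchangeable. The sketch below is a design illustration under that assumption; the interface and class names are hypothetical:

```python
# Hypothetical sketch of a storage-agnostic design: analyses depend on a
# tiny FileStore interface, never on a concrete backend.
from abc import ABC, abstractmethod
import hashlib

class FileStore(ABC):
    @abstractmethod
    def put(self, data: bytes) -> str:
        """Store data, return a content identifier."""

    @abstractmethod
    def get(self, cid: str) -> bytes:
        """Retrieve data by its identifier."""

class InMemoryStore(FileStore):
    """Stand-in backend; an IPFS-backed class would expose the same API."""
    def __init__(self):
        self._blobs: dict[str, bytes] = {}

    def put(self, data: bytes) -> str:
        # Content-addressed identifier, in the spirit of IPFS CIDs.
        cid = hashlib.sha256(data).hexdigest()
        self._blobs[cid] = data
        return cid

    def get(self, cid: str) -> bytes:
        return self._blobs[cid]

store: FileStore = InMemoryStore()
cid = store.put(b"genome payload")
assert store.get(cid) == b"genome payload"
```

Swapping backends then means swapping one class, not rewriting the analysis pipeline, which is exactly the flexibility the agnostic approach is meant to preserve.
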

A main realization is not a surprising one: we cannot commit to firm milestone dates until we have a proven Pilot in place and can see how the technical capabilities of the crypto market evolve over the next few months. So this will need to be a high priority. 

While we wrap up the White Paper, next week will bring a focus on our funding strategy and on building the advisory team. We will also dive into more detail on our technical development and partnership approach. 

August 18th, 2017 | Regular Updates