Radix Economic Engine, Technology

 

If today we could avail ourselves of the genuine testimony of at least one Byzantine general of that time ("Diarix from the battlefield with the army of Byzantium", by Andrea Ciacchella), we would surely find him astonished, and not only by the leap of more than 1,500 years.

This is because the human enterprise of building technology to resolve the dilemma generated by that same human variability, the choice between betraying or obeying an authority, has produced so many variants over the last decade that the poor man would be left equally without certainty or a solution for a while longer.

But without delving into the ethics of human nature, and while we wait for AI to step in where the hand of man falters, several solutions have meanwhile emerged that shift that uncertainty onto "automatic consensus" and push the world economy rapidly toward the most customizable product possible.

One of the companies which, in my opinion, has gone far beyond the development of the Bitcoin protocol in recent years is RADIX DLT.

From Newcastle, UK, the founder Dan Hughes and his team worked quietly for six years, relying on their own economic resources, knowing that only a brilliant solution would let them influence, for the years to come, the choice of technology for automatically distributed consensus governed by Smart Contracts. When in 2018 they made their code and business proposal public, there was no great stir; in fact, some dismissed it as just another boring blockchain architecture, turning the spotlight away from a project that is instead worth trying to understand for the innovation it presents.

Let’s start by pointing out that Radix is neither a blockchain nor a DAG (Directed Acyclic Graph).

The approach to this technology seems to broadly retrace the study that gave a name and a reason to the very life we interpret today: the discovery of the smallest existing element (until a few years ago, the atom) and of the universe that encloses everything. From Democritus to Hubble, passing through Dalton and Rutherford, matter is formed through the life of the atom, and so it must also be for Radix's DLT.

The Atom is created and fired into the network, precisely into a Universe of time; the latter defines the Space-Network where the nodes are present. A Node is nothing more than a computer or device with the Native Client software installed. Atoms come in two types: payload and transfer. Whenever a new event occurs (a transaction or a message), an Atom is generated, and it will carry: a default end-point address, to make it indexable; the operator's public key; a series of data and metadata about the owner and the object of the exchange; a currency amount (if a transfer is requested); the recipient of the exchange; and the operator's signature.
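
As a rough illustration of the fields such an Atom might carry, here is a minimal sketch in Python; the names (atom_type, endpoint_address, and so on) are assumptions made for illustration and do not come from the Radix client itself.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Atom:
    """Hypothetical sketch of an Atom as described above, not Radix's actual schema."""
    atom_type: str                   # "payload" (message) or "transfer" (value)
    endpoint_address: str            # default end-point address, used for indexing
    owner_public_key: bytes          # identifies the operator who created the Atom
    recipient: Optional[str] = None  # counterparty of the exchange, if any
    amount: Optional[float] = None   # currency amount, only for transfer Atoms
    data: dict = field(default_factory=dict)  # data and metadata on owner and object
    signature: bytes = b""           # operator's signature over the Atom's contents
```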

An Atom can also contain other Atoms, as long as the total size does not exceed 64 KB, the maximum allowed to sustain the speed guaranteed when sending the transaction or message to the nearby nodes. It is also through the nodes that the fragments (shards) are stored, and these are nothing other than the physical structure constituting Radix's database. Thanks to their small size, and despite numbering 2^64, these fragments allow indexing to do relatively simple work in locating the required data in the related fragments and tracing it back to all possible owning Nodes and to the previous ones it passed through, by virtue of the fact that transfer Atoms, which act a little like couriers, are replicated over multiple Nodes to keep the information they contain usable.
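
To see why a space of 2^64 small shards keeps indexing simple, consider a minimal sketch of how an end-point address could be mapped deterministically onto a shard; the SHA-256 choice here is an assumption for illustration, not Radix's actual sharding function.

```python
import hashlib

SHARD_COUNT = 2 ** 64  # the shard space described above

def shard_for_address(endpoint_address: str) -> int:
    """Map an end-point address onto one of the 2^64 shards (illustrative only)."""
    digest = hashlib.sha256(endpoint_address.encode("utf-8")).digest()
    # Interpret the first 8 bytes of the hash as an unsigned 64-bit shard index.
    return int.from_bytes(digest[:8], "big") % SHARD_COUNT

# Any node serving this shard may hold Atoms indexed by the address.
print(shard_for_address("radix-endpoint-example"))
```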

But let us come to the operation of this database distributed through fragments. An operator constituted in a node (it is assumed that it will have to identify itself and, once it has done so, will not be able to change its coordinates) creates and sends an Atom containing: its ID, its key, its signature, the good or service it wants to exchange (currency or otherwise), and finally the recipient of the transaction. The various Atoms of the transaction are created automatically, containing the above information in the payload Atom plus the characteristics of the object (in its primary state at the moment it is loaded onto the Network); these Atoms will settle in the proximity Nodes, thereby generating the Gossip phenomenon, that is, the splitting of the transfer Atoms into smaller particles, while always respecting the required formation, complete with the end-points needed for indexing.
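
A hedged sketch of the gossip step described above: each node that receives an Atom forwards it to a few neighbouring nodes it has not yet reached, until the whole proximity set has seen it. The topology, fan-out, and wire format here are all assumptions for illustration.

```python
import random
from typing import Optional

def gossip(atom: dict, node: str, peers: dict, seen: Optional[set] = None, fanout: int = 3) -> set:
    """Naive gossip: each node forwards the Atom to a few neighbours it has not reached yet."""
    if seen is None:
        seen = set()
    if node in seen:
        return seen
    seen.add(node)  # this node now stores and relays the Atom
    neighbours = [p for p in peers.get(node, []) if p not in seen]
    for peer in random.sample(neighbours, min(fanout, len(neighbours))):
        gossip(atom, peer, peers, seen, fanout)  # forward to proximity nodes
    return seen

# Toy topology: which nodes end up seeing an Atom injected at "n1"?
topology = {"n1": ["n2", "n3"], "n2": ["n4"], "n3": ["n4", "n5"], "n4": [], "n5": []}
print(gossip({"type": "transfer"}, "n1", topology))
```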

Strictly speaking, to guarantee a level of security higher than current standards, each node is provided with a logical clock that records, for each event (the creation of an Atom), one parallel time unit within time (a kind of badge), and only one at a time. Every time an event is created, the script automatically performs a temporal test, adding a timestamp, to confirm that the Atom has the consistency demonstrated by the particles previously fired to the proximity Nodes. In the reverse case, a "double-spend", the Atom would be present in more fragments than it should be; since the timestamp affixed is always unique with respect to an event, and all the other temporal evidence present in the Atom will always be earlier than the event that created it, the conflict is exposed.
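
A minimal sketch of the logical-clock idea under these assumptions: each node spends exactly one clock unit per witnessed event, and when two conflicting Atoms (a double-spend attempt) both reach nodes that witnessed them, the shared logical times say which one came first. This is a simplification for illustration, not the actual protocol.

```python
import time

class NodeClock:
    """Per-node logical clock: one unit per witnessed event, never reused."""
    def __init__(self, node_id: str):
        self.node_id = node_id
        self.counter = 0

    def witness(self, atom_hash: str) -> dict:
        """Record an event and return a small piece of temporal evidence for it."""
        self.counter += 1
        return {"node": self.node_id, "logical_time": self.counter,
                "wall_clock": time.time(), "atom": atom_hash}

def earlier_atom(evidence_a: list, evidence_b: list) -> str:
    """Given temporal evidence for two conflicting Atoms, keep the one that shared
    observer nodes witnessed first (illustrative heuristic only)."""
    times_a = {e["node"]: e["logical_time"] for e in evidence_a}
    times_b = {e["node"]: e["logical_time"] for e in evidence_b}
    shared = times_a.keys() & times_b.keys()
    votes_for_a = sum(1 for n in shared if times_a[n] < times_b[n])
    return "A" if votes_for_a * 2 >= len(shared) else "B"
```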

The temporal test, if by constitution it is not part of an Atom, is in turn atomized so that it can be transmitted to the surrounding nodes, forwarding the request for validation of the transaction in progress. While all this happens, the required data are aggregated into a data set assembled from the location, the timestamp, the hash of the Atom at creation, and finally any observer nodes.
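
The aggregated data set described above might look roughly like the sketch below, collecting location, timestamp, the hash of the Atom at creation, and any observer nodes; the field names are assumptions for illustration.

```python
import hashlib
import json
import time

def build_temporal_record(atom: dict, location: str, observers: list) -> dict:
    """Assemble the validation data set described above (illustrative structure only)."""
    atom_hash = hashlib.sha256(json.dumps(atom, sort_keys=True).encode("utf-8")).hexdigest()
    return {
        "location": location,         # where the evidence was assembled
        "stamped_at": time.time(),    # impression date
        "atom_hash": atom_hash,       # hash of the Atom at creation
        "observers": list(observers)  # nodes that have witnessed the Atom so far
    }
```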

When the validation protocol ascertains consistency at 51% (why go beyond and waste time?), the transaction is executed. Another Atom with the exchangeable asset Xa will be stored in the distributed ledger fragment, with the ID of the new owner of the exchanged asset.
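
The 51% consistency check amounts to a simple majority over the answers collected from the queried nodes; a sketch follows, with the threshold left as a parameter since the exact quorum rules are not spelled out here.

```python
def has_consensus(responses: list, threshold: float = 0.51) -> bool:
    """True once the share of nodes confirming the Atom's consistency reaches the threshold."""
    if not responses:
        return False
    return sum(1 for r in responses if r) / len(responses) >= threshold

# Example: 7 of 10 queried nodes confirm the Atom, so the transfer can be executed.
print(has_consensus([True] * 7 + [False] * 3))
```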

In this first part of the activity of exchanging Atoms between the various nodes, the absolute prominence of the nodes themselves is evident, and the following questions seem legitimate to ask: "Who will be behind the nodes?", "Will it be allowed to create consortia, and with a limit of how many nodes each?", "Will checks on the declared intent of the operator be necessary?" These questions, and everything else concerning the future security of the entire network, have every legitimacy in the world to be formulated, because of a simple equation: no computer application developed in a public environment will be 100% safe until all possible vulnerabilities are taken into account and resolved. Moreover, tests run in development will never reveal global issues the way production does, for example when the backup network is running while the mainnet is down; that is when internal decision-making has to be accelerated to an exceptional level. No CIO will ever want to manage that unfortunate situation.

So if in the nodes lies the power to condition the free flow of events, how can any kind of malicious intrusion be anticipated? Besides allowing a considerable heterogeneity of participants, which would permit far-reaching decentralization, a feasible option for keeping the Network safe from centralizing tendencies, and one of the most innovative solutions in my opinion, is the introduction of a Merkle Hash validation protocol, which records the constant activity of each Node in a summarized form, like a momentary post-it, where the principle of the Temporal Test is applied as in the creation of an Atom. This kind of backup, which represents data that is sensitive in all respects and will have the same characteristics as an Atom, is not yet defined as to where it will be fired, but common sense suggests sending it to certified nodes protected by additional proprietary signatures: an orbit of higher-class Nodes, which will probably be the core of the Radix Network.
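
Merkle hashing itself is standard; the sketch below shows how a node's recent activity could be folded into a single summarizing root hash, the "momentary post-it" mentioned above, though where Radix would actually send these summaries remains, as said, undefined.

```python
import hashlib

def merkle_root(leaves: list) -> bytes:
    """Fold a list of event payloads into one summarizing root hash."""
    if not leaves:
        return hashlib.sha256(b"").digest()
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last node when a level has an odd count
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

# Summarize a node's recent Atom activity into one verifiable fingerprint.
recent_events = [b"atom-1", b"atom-2", b"atom-3"]
print(merkle_root(recent_events).hex())
```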

In the following part, to understand the complexities of temporal activity, we would have to interpret how the different computing powers will interact, according to the hardware of each device connected to the Network, and what incentives will be offered to participate. The unknown I am referring to does not exist in structures bound to the Bitcoin protocol: there, the major computing powers grab the bulk of the reward at the finalization of the block, while the others are left only the crumbs. An operator of those blockchains will one day have to ask himself whether to keep losing all available margins progressively or to look for a Network where at least an acceptable margin of stability is guaranteed.

Written by Amec, Radixdlt ambassador