Blockchain is arguably one of the most significant and disruptive technologies to emerge since the inception of the Internet. It's the core technology behind Bitcoin and the other cryptocurrencies that have drawn so much attention over the last few years.
At its core, a blockchain is a distributed database that allows direct transactions between two parties without the need for a central authority. This simple yet powerful concept has great implications for various institutions such as banks, governments and marketplaces, to name a few. Any business or organization that relies on a centralized database as a core competitive advantage can potentially be disrupted by blockchain technology.
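To make the idea concrete, here is a minimal sketch of the data structure behind a blockchain: each block stores the hash of the previous block, so tampering with any record invalidates every block that follows. The class names (`Block`, `Blockchain`) and fields are illustrative assumptions, not from Bitcoin or any particular library, and this sketch omits the consensus and networking layers that make a real blockchain decentralized.

```python
import hashlib
import json
import time


class Block:
    """One record in the chain; its hash covers the previous block's hash."""

    def __init__(self, index, data, previous_hash):
        self.index = index
        self.timestamp = time.time()
        self.data = data
        self.previous_hash = previous_hash
        self.hash = self.compute_hash()

    def compute_hash(self):
        # Serialize deterministically so the same contents always hash the same.
        payload = json.dumps({
            "index": self.index,
            "timestamp": self.timestamp,
            "data": self.data,
            "previous_hash": self.previous_hash,
        }, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()


class Blockchain:
    def __init__(self):
        # The first ("genesis") block has no real predecessor.
        self.chain = [Block(0, "genesis", "0")]

    def add_block(self, data):
        prev = self.chain[-1]
        self.chain.append(Block(prev.index + 1, data, prev.hash))

    def is_valid(self):
        # Every block must still hash to its stored hash and link to its predecessor.
        for prev, curr in zip(self.chain, self.chain[1:]):
            if curr.previous_hash != prev.hash:
                return False
            if curr.hash != curr.compute_hash():
                return False
        return True
```

Changing the data inside any earlier block changes its hash, which breaks the `previous_hash` link of every later block, so `is_valid()` immediately detects the tampering. That tamper-evidence is what lets two parties transact without trusting a central database administrator.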
There is a widely held belief that because math is involved, algorithms are automatically neutral.
This misconception allows bias to go unchecked, and lets companies and organizations avoid responsibility by hiding behind algorithms.
If you begin with Computer Science, you will end with Philosophy
I survey a common theme that pervades the philosophy of computer science (and philosophy more generally): the relation of computing to the world. Are algorithms merely certain procedures entirely characterizable in an “indigenous”, “internal”, “intrinsic”, “local”, “narrow”, “syntactic” (more generally: “intra-system”) purely Turing-machine language? Or must they interact with the real world, with a purpose that is expressible only in a language with an “external”, “extrinsic”, “global”, “wide”, “inherited” (more generally: “extra-” or “inter-”system) semantics?