The story of Oppenheimer is clear.
Technology itself is not good or evil. People decide whether to use technological advances to destroy or create, to further their own interests or help others. But when the things we’ve made can only be controlled by those with the deepest of pockets or when tech starts to learn and transform beyond the limitations of its makers, what happens then? Who – or what – has the ultimate power?
This was the question on my mind when I recently listened to a talk given to Alpha Grid by Kitty Horlick, founder of Blackwood, a consultancy firm that onboards businesses into Web 3.0 and its associated technologies, including blockchain, non-fungible tokens (NFTs), the metaverse and cryptocurrencies.
Unsurprisingly, Horlick has a self-proclaimed bias towards the evolving world of Web 3.0.
While she emphasises that these technologies are nowhere near perfect and still in their infancy, Horlick believes they have the potential to take a system controlled by a handful of tech giants (our current Web 2.0) – where personal data is sold and manipulated with minimal consent – and transfer power back to the people.
But is the promise of Web 3.0 as fair and democratic as it seems?
AI is not new, but generative AI is… and it’s very unpredictable.
Generative AI doesn’t just predict; it creates. But its creations are based on human-made work (hence the current strikes in the entertainment industry). Blockchain technology could be used to trace where that work came from and provide more transparency around AI-generated art.
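To make that provenance idea a little more concrete, here is a minimal sketch in Python, under my own assumptions rather than any system Horlick described: a creative work is fingerprinted with a cryptographic hash, and that fingerprint plus creator metadata is the kind of record that could be anchored in a blockchain transaction and later cited by AI-generated derivatives. The file name and field names are purely illustrative.

```python
import hashlib
import json
import time


def fingerprint_work(path: str) -> str:
    """Compute a SHA-256 fingerprint of a creative work (image, text, audio)."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def build_provenance_record(path: str, creator: str) -> dict:
    """Bundle the fingerprint with creator metadata.

    In a real system this record (or a hash of it) is what would be written
    into an on-chain transaction, giving a tamper-evident, timestamped claim
    that `creator` published this exact work.
    """
    return {
        "work_hash": fingerprint_work(path),   # ties the record to the exact bytes
        "creator": creator,
        "timestamp": int(time.time()),
    }


if __name__ == "__main__":
    # Hypothetical usage: "artwork.png" stands in for any original work.
    record = build_provenance_record("artwork.png", "Example Artist")
    print(json.dumps(record, indent=2))
```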
One key thing to note about Web 3.0 and its associated tech is that it is also (at present) incredibly energy-intensive and environmentally destructive. In 2022, Bitcoin transactions alone generated more carbon emissions than the whole of Sweden.
And just like in traditional systems, those with vast amounts of wealth still seem to have the ultimate power.
As Horlick mentioned, it would be incredibly difficult and computationally expensive, for example, to train or abuse AI models or to corrupt a blockchain. That should be considered a good thing. But it’s not impossible, just very, very costly, which means the billionaire moguls who already influence so much of society could still manipulate and shape the Web 3.0 world.
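To see why corrupting a chain is costly rather than impossible, here is a toy, hand-rolled sketch of a hash-linked chain in Python. It is not how Bitcoin or any real network is implemented (real systems add proof-of-work, which is what makes redoing history so expensive), but it shows the basic property: each block stores the hash of the previous one, so altering any past record invalidates every block after it and forces all of them to be recomputed.

```python
import hashlib
import json


def block_hash(block: dict) -> str:
    """Hash a block's contents using a deterministic JSON serialisation."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


def make_chain(records):
    """Build a toy hash-linked chain: each block stores the previous block's hash."""
    chain, prev = [], "0" * 64
    for data in records:
        block = {"data": data, "prev_hash": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain


def verify(chain) -> bool:
    """Check every link; editing an earlier block breaks all later links."""
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True


chain = make_chain(["alice pays bob 5", "bob pays carol 2"])
print(verify(chain))                       # True: the chain is intact
chain[0]["data"] = "alice pays bob 500"    # tamper with history
print(verify(chain))                       # False: every later block would need recomputing
```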
For me, before we embrace the very real potential of digital democracy and decentralisation, we need to address the mass hoarding of money and widespread wealth inequality. Otherwise something like Web 3.0 will never be truly democratic, fair or free.
Note: some of this was written by AI. Can you guess which parts?