
The Merge

by Sara Galbiati, Peter H. Eriksen, Tobias S. Markussen

The Merge explores promising AI research at the world’s most prestigious labs and institutions. At the same time, the project investigates how supercomputers, artificial intelligence and robots affect our society.
By looking at interactions between human and machine, the project asks how this accelerating digital paradigm is affecting our emotional, social and moral norms.
The project employs an array of photographic approaches to document human existence as we move rapidly towards a point in history where the physical and digital worlds become so intertwined that it will be impossible to distinguish between the two.
The exponential development towards an AI singularity also raises more fundamental and overwhelming questions about our perceived reality: is it possible that our physical reality does not exist as we believe it to, and that we are instead living in a computer simulation? Is it possible that our world is just a construct, a created illusion?
Philosophers have been questioning our perception of reality since Plato’s Allegory of the Cave. This existential discussion gained new interest in 2003, when the Oxford University philosopher Nick Bostrom published “The Simulation Argument”, which contends that life on Earth could indeed be a computer simulation. Since then, the academic debate has raged, and high-ranking tech executives such as Tesla’s Elon Musk have publicly endorsed the theory.
The reasoning is based on the exponential pace at which artificial intelligence is developing. It may not be long before we are able to create convincing simulations of our own experienced reality. This leads proponents of the theory to argue that if we can create such simulations, humans might themselves be programs inside a simulation run by others.
The Merge aims to open a debate about the complex dynamics between humans and technology, and the possible realities that derive from them. The images balance realism and imagination.

sarapetertobias.com