The University of Surrey, a UK institution, has committed to safeguarding national video archives and government records from unauthorized tampering, using blockchain technology and artificial intelligence.
According to a press release, the university has teamed up with the Open Data Institute and the National Archives of the United Kingdom, drawing on its Centre for Vision, Speech and Signal Processing (CVSSP). The CVSSP will develop a secure computer-vision platform, backed by blockchain, to preserve digital archives for centuries to come.
According to the same press release, the platform will be decentralized and named ARCHANGEL.
Computer vision is a branch of artificial intelligence designed to read, recognize and analyze images and video. The platform will generate a digital fingerprint for each archived item, making it readily identifiable and helping to distinguish an authentic piece from a forgery.
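The fingerprinting idea can be illustrated with a minimal sketch. Here a plain cryptographic hash stands in for the fingerprint; ARCHANGEL's actual method is more sophisticated, and the names and sample data below are purely illustrative:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest that acts as a digital fingerprint for the data."""
    return hashlib.sha256(data).hexdigest()

original = b"Cabinet meeting minutes, 12 March 1984 ..."
forgery = b"Cabinet meeting minutes, 12 March 1984 .."  # one byte removed

# Identical content always yields the same fingerprint ...
assert fingerprint(original) == fingerprint(original)
# ... while even a one-byte change produces a completely different one,
# which is what makes an authentic piece distinguishable from a forgery.
assert fingerprint(original) != fingerprint(forgery)
```

Because the fingerprint is tiny compared to the archive itself, only the digest needs to be recorded and compared, not the full document.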
The ARCHANGEL system is backed by a proof-of-authority blockchain protocol, which will immediately flag any section of an archive that has been tampered with. Should a specific piece be modified, whether accidentally or maliciously, the system will display a clear message so the issue can be resolved, either manually or automatically.
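A rough sketch of that flagging step, assuming fingerprints are recorded at archiving time and re-checked later (the in-memory `ledger` dictionary is a hypothetical stand-in for the distributed ledger, and all names and data are illustrative):

```python
import hashlib
from typing import Dict, List

def fingerprint(data: bytes) -> str:
    """Hex digest acting as a digital fingerprint for a section's bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical stand-in for the blockchain ledger: section name -> fingerprint.
ledger: Dict[str, str] = {}

def register(name: str, data: bytes) -> None:
    """Record a section's fingerprint at archiving time."""
    ledger[name] = fingerprint(data)

def flag_tampered(archive: Dict[str, bytes]) -> List[str]:
    """Return the names of sections whose current fingerprint no longer
    matches the one recorded on the ledger."""
    return [name for name, data in archive.items()
            if ledger.get(name) != fingerprint(data)]

register("reel-01", b"original footage bytes")
register("reel-02", b"more footage")

archive = {"reel-01": b"original footage bytes",
           "reel-02": b"more footage, quietly edited"}
print(flag_tampered(archive))  # -> ['reel-02']
```

In the real system the recorded fingerprints live on an append-only chain rather than in a mutable dictionary, so the reference copy itself cannot be quietly rewritten.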
The platform will be open source: anyone who wishes to inspect the records is welcome to, but modification is impossible. This ensures the records remain authentic and preserved for future study. Decentralization will also guard against censorship by protecting the documents from government influence.
The framework has already been tested by national government archives in Norway, Australia, the USA and the UK. The university hopes to take the project global, providing a safer, more reliable way to safeguard historical records against politically motivated forgery.