ArchiveChain (or maybe AlexandriaChain?)

HappyOwlCEO Member Posts: 17 ✭✭
Just a rough idea for an Ethereum-based blockchain - wanted to see if it's interesting to others and worth further thought.

ArchiveChain - keeping history honest
I've noticed a trend around significant newsworthy events: the initial reporting doesn't always match the official story that emerges later - often resulting in that initial reporting being edited or removed from the internet. Sometimes this is because of faulty reporting and research, sometimes it may be to support a "cover-up", and sometimes it happens without us ever really knowing why.

The ArchiveChain would be a place where the social media reporting of historical events can be preserved forever, prevented from being edited after the fact. Sometimes the last version of a story is the most accurate, but I believe people should have the ability to see the first version if they so choose.

Users pay to start an archive of a particular topic, for example "Kiev", decide how much granularity they want and how long the archive should run, and the software begins to monitor the social networks for that topic, scrapes the relevant content and embeds the results into the blockchain. Related Vine, YouTube and Vimeo videos could be hosted on servers owned by "the ArchiveChain Foundation" but shared via BitTorrent for decentralization and speed.
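
Here's a very rough Python sketch of what starting an archive and packaging a scraped post might look like. Everything in it - ArchiveSpec, the fee formula, the record layout - is made up for illustration, not an existing contract or API:

```python
from dataclasses import dataclass
from datetime import datetime
from hashlib import sha256

# Hypothetical sketch only: none of these names exist anywhere yet.

@dataclass
class ArchiveSpec:
    topic: str            # e.g. "Kiev"
    granularity: int      # 1 = popular posts only ... 10 = everything
    duration_days: int    # how long the archive keeps monitoring

    def fee_estimate(self, price_per_unit: float = 0.001) -> float:
        """More granularity and a longer run cost more (in ether, say)."""
        return self.granularity * self.duration_days * price_per_unit

def make_record(spec: ArchiveSpec, source: str, text: str, posted_at: datetime) -> dict:
    """Package one scraped post for embedding in the chain; the content hash
    lets anyone check later that the stored text was never altered."""
    return {
        "topic": spec.topic,
        "source": source,                   # "twitter", "facebook", ...
        "posted_at": posted_at.isoformat(),
        "text": text,
        "content_hash": sha256(text.encode()).hexdigest(),
    }

if __name__ == "__main__":
    spec = ArchiveSpec(topic="Kiev", granularity=3, duration_days=90)
    print("estimated fee:", spec.fee_estimate())
    rec = make_record(spec, "twitter", "Example tweet about Kiev", datetime.utcnow())
    print(rec["content_hash"][:16], rec["posted_at"])
```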

The UI for the finished archive (or, I guess, even while it's continuing to archive) can look like a word cloud that evolves as you move along the time axis (words grow and shrink or change color to denote their popularity over time) and, depending on the amount of granularity requested, one can zoom all the way down to a single tweet or FB share, even if it wasn't "popular".
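
The data behind that view is basically just word counts per time bucket. A toy sketch of how it could be computed from archived posts (the function name and the one-day bucket size are arbitrary choices, not part of any spec):

```python
from collections import Counter
from datetime import datetime

def word_frequencies_by_day(posts):
    """posts: iterable of (datetime, text) pairs pulled from the archive.
    Returns {date: Counter of words} - the raw material for a word cloud."""
    buckets = {}
    for posted_at, text in posts:
        day = posted_at.date()
        buckets.setdefault(day, Counter()).update(text.lower().split())
    return buckets

if __name__ == "__main__":
    posts = [
        (datetime(2014, 2, 20, 9), "protest crowds in kiev"),
        (datetime(2014, 2, 20, 14), "kiev square crowds growing"),
        (datetime(2014, 2, 21, 8), "negotiations in kiev"),
    ]
    for day, counts in sorted(word_frequencies_by_day(posts).items()):
        print(day, counts.most_common(3))
```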

Granularity is how the problem of "too much data" would be dealt with - if someone just wants a decent overview of the prevailing trends around a topic, only the most popular ideas and sentiments would be stored in the blockchain, which wouldn't take up much space. If someone wants a very detailed and granular preservation of a topic, they would have to pay quite a bit more for it, since the resources required would be much higher - more pay for the archive means more miners sharing the workload.
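
One way the granularity setting could cap what actually gets written to the chain: at low granularity only the most popular posts per day survive, at maximum granularity everything is kept. Again a hypothetical sketch - the popularity score and the cap formula are made up for illustration:

```python
def select_for_storage(posts, granularity, max_granularity=10):
    """posts: list of dicts with 'day', 'text' and 'popularity' keys.
    Returns the subset that would be embedded in the blockchain."""
    if granularity >= max_granularity:
        return posts                      # full preservation, highest cost
    per_day_cap = granularity * 10        # e.g. granularity 1 -> 10 posts/day
    by_day = {}
    for p in posts:
        by_day.setdefault(p["day"], []).append(p)
    selected = []
    for day_posts in by_day.values():
        day_posts.sort(key=lambda p: p["popularity"], reverse=True)
        selected.extend(day_posts[:per_day_cap])
    return selected

if __name__ == "__main__":
    posts = [
        {"day": "2014-02-20", "text": "tweet A", "popularity": 120},
        {"day": "2014-02-20", "text": "tweet B", "popularity": 4},
    ]
    print(len(select_for_storage(posts, granularity=1)))  # cap is 10/day, keeps both
```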

@rtaylor said he sees it as "a fact checking service, like snopeschain" and that it should have optional parameters to "control source scope and reliability" in case a journalist wants to include data from blogs and other media sources. He also suggested that "Universities could maintain archive chains, too, keeping the information free, up-to-date and accurate. It would be cheaper and much more future-proof than their current libraries."

He also suggested the name AlexandriaChain, in memory of the Library of Alexandria, where it is believed that a great deal of the world's knowledge was lost when it was burned to the ground by the Romans (and others).

Thoughts?

Comments

  • StephanTual London, England Member, Moderator Posts: 1,282 mod
    Good point - the other day I was using the Internet Archive's Wayback Machine (http://en.wikipedia.org/wiki/Internet_Archive) and thought what a shame it would be for all this valuable information to disappear.

    Studying decentralization really makes you realize how many important resources are inches away from being wiped out. The foundation could fail (through bankruptcy, internal dissent or scandal), the storage layer could be hacked, etc.

    Once you wrap your head around the concept of decentralized storage, it's impossible to look at even the best disaster recovery strategy and not see how fragile it actually is.
