Decentralized (P2P) app marketplace using Ethereum and Singularity.

harveybc0 Member Posts: 29
What do you think of the marketplace for P2P applications based on Ethereum and the Singularity architecture?

If any programmer wants to start a program or evaluate its adaptability to P2P operation, please use the description of the mining process and the object data model at:

http://forum.ethereum.org/discussion/882/congratulations-humanity-you-have-reached-singularity#latest

Suggestions and questions welcome.

Comments

  • Jasper ✭✭✭ Eindhoven, the Netherlands Member Posts: 514
    To be honest, it is hard to follow, could you rehash the idea neatly?
  • harveybc0 Member Posts: 29
    edited May 2014
    Yes, every P2P application has to split the mining process in two: one part for mining Ethereum, and another (larger) part for a generic application which measures the CPU consumed to process a job. The miners receive Ethereum coin based on that measurement, in a way configured per application (a large quantity compared to the mined part).
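    The split described above could be sketched as follows. This is a minimal, hypothetical payout model: the split ratio and all names are illustrative assumptions, not part of the Singularity spec, and how the CPU measurement is obtained is left open here.

```python
# Hypothetical payout split for one block: a small Ethereum-mining share plus
# a larger application share paid in proportion to measured CPU per job.
# app_share and all names are illustrative, configured per application.
def payout(block_reward, jobs, app_share=0.8):
    """jobs: {miner_id: measured_cpu_seconds} for the application part."""
    eth_part = block_reward * (1 - app_share)   # classic mining reward
    app_pool = block_reward * app_share         # paid for useful work
    total_cpu = sum(jobs.values()) or 1
    shares = {miner: app_pool * cpu / total_cpu
              for miner, cpu in jobs.items()}
    return shares, eth_part
```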

    In Singularity there is a shared data object called a category, which has some fields that make it scalable, such as connections with other categories.

    For example, in a distributed social app the data object is created for each user and contains several chains of blocks (for messages, for photos, etc.). Those blocks can contain encrypted data which the user does not want some other group to see, and cryptographic keys to see the info from other users.

    If I was not clear, please ask me for another example.
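    A minimal sketch of the per-user category object described above, assuming one shared object per user holding several independent block chains. Class and field names (`Category`, `chains`, `links`) are illustrative, not taken from the Singularity spec.

```python
# Sketch of a "category" shared data object for a P2P social app: one object
# per user, holding independent hash-linked chains (messages, photos, ...).
import hashlib
import json

class Block:
    def __init__(self, prev_hash, payload):
        self.prev_hash = prev_hash   # hash of the previous block in this chain
        self.payload = payload       # possibly encrypted application data
        self.hash = hashlib.sha256(
            (prev_hash + json.dumps(payload, sort_keys=True)).encode()
        ).hexdigest()

class Category:
    """One shared data object per user, with several block chains and links."""
    def __init__(self, owner):
        self.owner = owner
        self.chains = {}             # e.g. "messages", "photos"
        self.links = []              # connections to other categories

    def append(self, chain_name, payload):
        chain = self.chains.setdefault(chain_name, [])
        prev = chain[-1].hash if chain else "0" * 64
        chain.append(Block(prev, payload))
        return chain[-1].hash

user = Category("alice")
user.append("messages", {"to": "bob", "body": "<ciphertext>"})
user.append("photos", {"ref": "<shared resource id>"})
```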
  • harveybastidas Member Posts: 20
    edited May 2014
    Not all the data has to fit in the category; the chains of app data blocks can be links to other shared resources. I know there are other alternatives for social P2P apps, such as:

    -http://p2pframework.com/?page_id=6&lang=en
    -http://www.bitgroup.org/

    But I want to use the Bitcoin mining process efficiently for these apps (the Singularity mining job may include processing, storage, and transmission work).

    Please tell me if this is not what you needed to know.
  • harveybastidas Member Posts: 20
    edited May 2014
    Not all the data has to fit in a category; the chains of app data blocks can be links to other shared resources.

    Another example of the use of categories is the AI experts app described in the Singularity forum thread.

    Yet another example is the marketplace app.

    Every program in the marketplace is a shared data object as described before, and contains chains of blocks for its transaction history, which can hold information on sales, code changes, interfaces with other apps, usage metrics, ratings, etc.

    The protocol I am using for the management of the shared data objects is a DHT (the one used for BitTorrent).

    http://www.bittorrent.org/beps/bep_0005.html

    Please tell me if this is not what you needed to know; if you have another implementation approach, it would be appreciated.
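    As a rough illustration of how a shared data object could be keyed and located in a BEP 5-style (Kademlia) DHT: 160-bit SHA-1 keys and the XOR distance metric, as in the BitTorrent DHT. The object-naming scheme below is an assumption for the example, not part of BEP 5.

```python
# Sketch: locating the nodes responsible for a shared data object in a
# Kademlia-style DHT (BEP 5 uses 160-bit SHA-1 keys and XOR distance).
import hashlib

def dht_key(object_id: str) -> int:
    """160-bit key for a shared data object, as in the BitTorrent DHT."""
    return int.from_bytes(hashlib.sha1(object_id.encode()).digest(), "big")

def xor_distance(a: int, b: int) -> int:
    """Kademlia closeness: nodes store keys they are nearest to under XOR."""
    return a ^ b

def closest_nodes(key: int, node_ids, k=8):
    """The k nodes responsible for storing/serving this key."""
    return sorted(node_ids, key=lambda n: xor_distance(n, key))[:k]

# Hypothetical usage: find the 3 nodes that should hold a marketplace object.
key = dht_key("marketplace/app/1234")
nodes = [dht_key(f"node-{i}") for i in range(20)]
responsible = closest_nodes(key, nodes, k=3)
```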
  • harveybastidas Member Posts: 20
    edited May 2014
    The miners can use the measurement of resource usage from distributed technologies like DC-Apache, Hadoop, or BOINC (in a private network of miners), so programmers can build applications using the miners' resources with all the tools normally used to build web apps, with all the coin transactions managed by Ethereum.

    http://hadoop.apache.org/
    http://www.cs.arizona.edu/projects/dc-apache/
    http://boinc.berkeley.edu/
  • Jasper ✭✭✭ Eindhoven, the Netherlands Member Posts: 514
    I must say, this feels more like a stream of consciousness than a clear presentation to me. Other people don't have access to the internal state of your mind, you know. Possibly it is better to hit draft and come back later.

    Well, I am highly critical of Gridcoin, which uses BOINC, and of the other things you link in this context. It is not decentralized, merely distributed; it depends on a central point to 'vet' the computation, and that central point doesn't even seem to have a necessarily effective way to vet it. The computer itself cannot 'measure the CPU usage' either, because it is easy to fool it into reporting anything.

    The other things are fine :) but I don't see how they fit with your idea.
  • harveybastidas Member Posts: 20
    Yes, you got it right; it is the same mechanism as Gridcoin, but instead of contributing to the public BOINC projects, you use BOINC in a private network for the miners, or use any other distributed computing, distributed content distribution, or decentralized storage system instead of the BOINC part in Gridcoin. So the computing part becomes "personalizable" and does not only contribute to the public projects.

    The CPU, transmission, and storage/time must be indirectly measured by the miners; for example, in the AI experts app, the CPU required to get an efficiency increase (not continuous) is approximated by the population size, the Kolmogorov complexity of the genomes, and the size of the dataset.
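    The indirect measurement above might look something like this. The cost model, its coefficients, and the exact role of each parameter are illustrative assumptions; the point is only that the estimate is derived from quantities every miner can check, rather than from self-reported CPU usage.

```python
# Illustrative cost model for indirect measurement: CPU charged for a
# training job is estimated from verifiable parameters, not reported usage.
def estimated_cpu_cost(population_size, genome_complexity, dataset_size,
                       cost_per_eval=1e-6):
    # One generation evaluates each genome of the population against the
    # whole dataset; genome_complexity stands in for the (approximated)
    # Kolmogorov complexity, i.e. the per-sample evaluation cost.
    evaluations = population_size * dataset_size
    return evaluations * genome_complexity * cost_per_eval
```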
  • harveybastidas Member Posts: 20
    edited May 2014
    I found a problem with the change of the proof of work: as @Jasper commented in the project thread, security is important, and the security of the "proof of work" is compromised if the payment to the miners is based only on indirect measurement of CPU/storage/transmission costs.

    The marketplace and the social app can be done P2P using existing technologies and the category system of Singularity (see the previous comment), and can use Ethereum mainly for the rating system.

    But the mining system is very difficult to adapt to a generic CPU/storage/transmission platform without compromising security.

    The only way I can make the "proof of work" more useful is to use AI genomes for consensus, as described in the posts of the project thread, because the expert's efficiency increases over time and is easily verifiable by the miners, as all have the same training dataset. A hacker can't make a random genome that has more efficiency for the dataset, and if he did, it would be a good thing.
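    The verification step described above could be sketched as follows: since every miner holds the same training dataset, a submitted genome can be re-scored deterministically and accepted only if it improves on the current best. The `evaluate` function here is a hypothetical stand-in for the application's real fitness function.

```python
# Sketch of verifiable useful work: miners re-score a submitted genome on
# the shared dataset; consensus needs no trusted resource meter.
def evaluate(genome, dataset):
    # Placeholder fitness: fraction of samples the genome predicts correctly.
    return sum(1 for x, y in dataset if genome(x) == y) / len(dataset)

def verify_submission(genome, dataset, best_fitness):
    """Each miner re-runs the evaluation and accepts only an improvement."""
    fitness = evaluate(genome, dataset)
    return fitness > best_fitness, fitness

# Hypothetical usage: a genome that fits the shared dataset is accepted.
dataset = [(x, x % 2) for x in range(100)]
ok, fit = verify_submission(lambda x: x % 2, dataset, best_fitness=0.9)
```

    A random genome is very unlikely to beat the current best on the shared dataset, which is the point of the last paragraph above: fabricating an accepted submission requires actually producing a better expert.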