Congratulations Humanity, you have reached SINGULARITY

harveybc · Cali, Colombia · Posts: 9 · Member
As the CPU computing power required for Bitcoin generation under the "proof of work" concept is totally wasted (and ultimately converted to CO2), I devised a method to save this computational power and use it in the form of trained AI experts (neural network instances). The difficulty variation of the puzzle in Bitcoin is replaced with the variable Kolmogorov complexity of any of the training instances of the AI experts; this instance is evaluated by the client to generate the "proof of work", and the experts are trained distributedly using the network's computing power.
You can spend a lot of GPU power training an AI expert (several instances, several iterations), but to use the trained version you only need the best neural network generated and few CPU resources; the advanced neuroevolution techniques themselves are out of reach for an Arduino Uno or Raspberry Pi, yet the resulting trained experts are not.
A novel modified fractal Turing machine is used to communicate neuroevolution commands and signals to field machines and Internet-of-Things devices, minimizing the cost of transmitting the experts to the final user, and to replicate, modify, and use the best trained AI expert available in the network at low CPU cost.
This resource is invaluable for research, as interconnectable, configurable, connection-based parametrization inside a taxonomical distribution of AI experts can deliver advanced image analysis and data mining that, due to lack of computing power, are out of reach for some people. Additionally, the modular, interconnectable nature of neural networks makes possible the creation of very complex experts and the use of the final trained expert in a prototype or mobile device.
The name comes from the fact that when all the AI experts in the taxonomy reach efficiencies superior to their human counterparts, by definition the machine is on average more "intelligent" than one human. This concept is called the Technological Singularity, and I think that, done correctly, it will vastly improve all aspects of life.
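Kolmogorov complexity itself is uncomputable, so any implementation would need a practical proxy. One common stand-in is compressed size, which upper-bounds it. A minimal sketch of the thresholding idea (the names `complexity_proxy` and `passes_threshold` are illustrative, not from any existing client):

```python
import os
import zlib

def complexity_proxy(genome: bytes) -> int:
    """Upper-bound the Kolmogorov complexity of a genome by its compressed size."""
    return len(zlib.compress(genome, 9))

def passes_threshold(genome: bytes, threshold: int) -> bool:
    """Accept a training instance only if its estimated complexity is within
    the current difficulty threshold."""
    return complexity_proxy(genome) <= threshold

# A repetitive (structurally simple) genome compresses far better than a
# random-looking one, so the proxy orders them as expected.
simple_genome = b"AB" * 500
random_genome = os.urandom(1000)
```

A genome of repeated structure passes a tight threshold that a random-looking genome of the same length fails, which is exactly the property a difficulty knob needs.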

Comments, suggestions, help welcome.

Comments

  • ranford · Posts: 25 · Member ✭✭
    The title sounded like hype, and a bit way out, but you would be onto something grand if the training/use of neural networks can somehow be used as part of the proof-of-work process. Can you provide more details of how this would actually work? Have you written a paper on it yet? Would love to see more details.
  • chris613 · Posts: 93 · Member ✭✭
    Yeah, this is kinda hard to follow, but I think I get the gist of it. Replace difficult hashing tasks with some sort of neural network training task that has a measurable difficulty. Who would design the networks, the goals, and the overall architecture that aggregates the various trained networks into a functioning singularity-inducing AI is up in the air, I guess; but the thinking is that if we poured the kind of computing power into AI training that we do into finding difficult hash nonces, the world would be a better place. Can't say I disagree. This needs more thinking, though.

    @harveybc‌, do you have any code or other formal descriptions we could talk about or is this purely a product of ideation at the moment?
  • harveybc · Cali, Colombia · Posts: 9 · Member
    edited May 2014
    Yes, the test I have been making is to evolve just one decentralized expert, passing a genome (which is a training instance) to the clients based on a threshold that is the Kolmogorov complexity, and evaluating it with the client's inputs. The distributed training system calculates and assigns the Kolmogorov complexity for each champion of each expert in the taxonomy and for the rest of the training instances. It uses a modified version of the neuroevolution technique NEAT. Ken Stanley, the author of the technique, told me that this was an "ambitious vision", but didn't criticize anything, hehe. About the taxonomy of experts: I have an initial scalable taxonomy of experts based on function and structure; it will be published on NeuralZoo.com for use in this and other projects. All the software will be open source. The program I have now is too ugly to publish, but I think I can have a decent (multi-expert) client next week (or the following). I am writing whitepapers for each procedure that might look complex, but when you read them they are not. I am solving several details of the implementation on Ethereum, but I hope that if for some reason I can't complete this work, this example will serve as a paradigm for the unavoidable development of a Singularity. The landing page of the technology will be singularitydoor.com (I had no time to finish it), but next week it will also be up, with all the documentation and examples I've done. Until now I have used the Encog AI framework with C++.
    Post edited by harveybc on
  • harveybc · Cali, Colombia · Posts: 9 · Member
    OK, reading my response there is an error that may be very confusing: the genome passed to the client is the champion (best training instance) of a taxonomic category. The Kolmogorov complexity, as a measurement of how difficult an instance is to train, is used as a threshold to pass the adequate training instances (neural networks) to a client mining Singular Coin during the production of the "proof of work", to process along with a piece of the training dataset (this is the difficulty) for the taxonomic category (like a piece of the modified fractal Turing tape). Don't be scared by the fractal part; it is actually the easiest part of the process. It is just an iteration which receives neuroevolution instructions (create, modify) for each node present in the iteration, and thanks to the modify command we can send data in the same structure as the genome for the network. It would be nice if, as "the price" of using one expert, the client contributed to the training system even a little bit; I am checking whether that is fair, and may leave it optional, or leave it as it is now, with training only during "proof of work" generation.
  • harveybc · Cali, Colombia · Posts: 9 · Member
    edited May 2014
    Singularity Mining:

    I still have a lot of things to define, but the basic process of Singular Coin mining is as follows for now. This is not formal or final; suggestions and observations are welcome:

    Each taxonomic category is composed of sets of genomes, which are training instances and will be trained by the miners. The genome with the best fitness on the dataset for the taxonomic category is called the champion; the taxonomic category also specifies the best or recommended training method for itself.

    1. Each miner node collects a set of genomes. A miner only collects genomes for a single taxonomic category (like only taking individuals from a single species).

    2. At the same time, the miner node collects the training set for the taxonomic category.

    3. Each miner node trains the category genomes using NEAT or another neuroevolution approach.

    4. When a miner node is able to improve on the previous champion's fitness, it broadcasts the new champion's genome to all nodes.

    5. If there is consensus on the superior fitness of the champion for the taxonomic category, the miner receives a quantity of SingularCoin determined by the fitness increment and the complexity of the taxonomic category (training more complex experts earns more Singular Coin).

    6. The miner broadcasts the evolved training genomes (including the champion) for other miners to use, but only those with better fitness than the existing ones.

    I know there are a lot of things left to define, and I am working on them. Due to the highly parallelizable nature of neuroevolution training, the training of a taxonomic category can be divided further between the miners, and they will share the reward based on various factors (WIP).
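As a toy illustration of steps 3 and 4 (this is not NEAT or Encog: the mutation step is a crude stand-in for neuroevolution, and the fitness function is a made-up squared-error score):

```python
import random

def fitness(genome, dataset):
    # Hypothetical fitness: negative squared error of the genome's values
    # against the dataset. A real miner would run the evolved network.
    return -sum((g - d) ** 2 for g, d in zip(genome, dataset))

def mutate(genome):
    # Stand-in for one NEAT-style variation step (weight perturbation only).
    child = list(genome)
    i = random.randrange(len(child))
    child[i] += random.uniform(-0.5, 0.5)
    return child

def mine_round(genomes, dataset, champion):
    """One round of steps 3-4: train, and broadcast only on improvement."""
    best_fit = fitness(champion, dataset)
    evolved = [mutate(g) for g in genomes]
    broadcast = None
    for g in evolved:
        f = fitness(g, dataset)
        if f > best_fit:
            best_fit, broadcast = f, g  # step 4: a new champion was found
    return evolved, broadcast

random.seed(1)
dataset = [1.0, 2.0, 3.0]
genomes = [[0.0, 0.0, 0.0] for _ in range(20)]
champion = genomes[0]
for _ in range(200):  # repeated mining rounds
    genomes, better = mine_round(genomes, dataset, champion)
    if better is not None:
        champion = better  # steps 5-6 would award coin and share genomes here
```

The key property is that the champion's fitness only ever increases, which is what gives the network something objective to reach consensus on in step 5.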

    Suggestions, questions, improvements, etc.?

    Post edited by harveybc on
  • Jasper · Eindhoven, the Netherlands · Posts: 514 · Member ✭✭✭
    Don't quite follow, and it sounds pretty speculative to me, but at least there are some specific things to it. Perhaps you should link to NEAT-related material.
  • harveybc · Cali, Colombia · Posts: 9 · Member
    http://en.wikipedia.org/wiki/Neuroevolution_of_augmenting_topologies

    If you do not fully understand what you are reading, you can't say it is speculative, please be specific.
  • harveybc · Cali, Colombia · Posts: 9 · Member
    edited May 2014
    You can use NEAT and other very good AI methods with Encog, in C++, Java, and other languages:
    http://www.heatonresearch.com/encog

    This is detailed info on NEAT:
    http://nn.cs.utexas.edu/downloads/papers/stanley.ec02.pdf

    Obviously you do not implement the neuroevolution algorithms at a low level yourself (they are already optimized in Encog). If after reading that you consider it infeasible, let me know and I will explain.

    NEAT has a great characteristic: the training instances in the population are represented by genomes (strings or files), grouped into what NEAT calls species, and training can be started from existing genomes, which is what I do during mining.

    I have encountered a problem: perhaps Ethereum does not offer the possibility of changing the mining function. Do you know any method of doing this other than forking an existing project like Bitcoin or Ethereum?
    (the best option now is PPCoin https://bitcointalk.org/index.php?topic=101820.0)
    Post edited by harveybc on
  • Jasper · Eindhoven, the Netherlands · Posts: 514 · Member ✭✭✭
    If you do not fully understand what you are reading, you can't say it is speculative, please be specific.
    Well, if I find something incomprehensible, and I think it is in fact because it is vacuous, I will say so, despite 'not understanding'. You should be glad that I didn't do so, especially considering there are Bitcoin miners worrying their machines will devalue.

    Anyway, I think the main link missing in your explanation is how Bitcoin ASICs fit into this.
  • harveybc · Cali, Colombia · Posts: 9 · Member
    You are right. Sorry, I was having a bad day. I will try to hear all the opinions carefully, since every opinion is helpful.
  • chris613 · Posts: 93 · Member ✭✭
    edited May 2014
    @Jasper‌ I don't see anywhere that Harvey is suggesting this would bring new life to ASIC miners. He is suggesting a replacement for the proof-of-work algorithm so that it is not wasteful: it serves to train AIs at the same time that it secures the blockchain.

    These ideas are starting to make more sense to me, looking forward to seeing some more concrete formalizations.
    Post edited by chris613 on
  • Jasper · Eindhoven, the Netherlands · Posts: 514 · Member ✭✭✭
    Right, I got totally sidetracked by the mention of 'the computational power of Bitcoin'.
  • StephanTual · London, England · Posts: 1,283 · Member, Moderator
    Right, useful PoW, the eternal question :)

    An algo is only valid as a PoW algo if it can a) be considerably harder to solve than to verify (hashes are great for that, obviously) and b) be made to scale in difficulty by adjusting some arbitrary parameter (in Bitcoin, the difficulty target, which miners meet by searching over nonces).
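A minimal sketch of those two properties with SHA-256, assuming a simplified leading-zero-bits target rather than Bitcoin's actual difficulty encoding: solving takes on the order of 2^difficulty_bits hash attempts, while verifying takes one.

```python
import hashlib

def verify(block: bytes, nonce: int, difficulty_bits: int) -> bool:
    """Cheap check: one hash, top difficulty_bits bits must be zero."""
    h = hashlib.sha256(block + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(h, "big") >> (256 - difficulty_bits) == 0

def solve(block: bytes, difficulty_bits: int) -> int:
    """Expensive search: expected ~2**difficulty_bits hash attempts."""
    nonce = 0
    while not verify(block, nonce, difficulty_bits):
        nonce += 1
    return nonce

# Property (b): raising difficulty_bits makes solve() exponentially harder,
# while verify() stays a single hash either way.
nonce = solve(b"example block", 12)
```

Any replacement PoW, NN training included, has to preserve exactly this solve/verify asymmetry plus the tunable difficulty parameter.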

    Note that the above has nothing to do with Turing completeness, as Turing completeness does not automatically grant a PoW algo 'usefulness'. The tentative solution proposed by Gavin in his yellow paper (http://gavwood.com/paper.pdf), for example, solving corrupted contracts, is Turing complete. Is it useful? Well, it secures the network (just like SHA-256 does on Bitcoin), and if proven to be sequential-memory-hard as well, it could help prevent mining centralization. That said, it's not protein folding either (a useful thing for humanity, but useless for PoW).

    So the question to @harveybc is: is training NNs something that can somehow be made appropriate as PoW, and/or do you perhaps have tentative information from somewhere that might hint that it could be the case?
  • chris613 · Posts: 93 · Member ✭✭
    I see problems with the verifiability of what Harvey is suggesting, but I'll reserve my judgement until I see some code and/or math.
  • harveybc0 · Posts: 29 · Member
    edited May 2014
    I will try to make all the formalizations required for a fully implemented, recursive, distributed information-processing machine based on the 'proof of work', for applications that generate (obviously hashable) segments of information that can be used by other miners; the process has to have a mechanism for estimating the processing power required to process the segments of information, and a particular application of it is the distributed AI. I have no formal background in AI nor in P2P networks; I am an enthusiast of both. I have only done a distributed NEAT using the PhD dissertation of Ken Stanley, and I know a little CUDA from trying to make some money (for gaming) from Forex in 2008 or 2009, but I failed miserably and abandoned that project to finish my degree (I am an electronics engineer). I will try to make everything work together, maybe using Ethereum for the transaction management of the machine. But it will take some time.

    For your entertainment only: apart from the useful research and Internet-of-Things applications, there is another (ugly?) use for AI. This is the year when an AI expert made its appearance in the Forex world (approx. USD $3 trillion/day market):
    http://championship.mql4.com/2007/

    This means that while mining Bitcoin, people would also be mining real money using distributed experts, if the best expert is trained and the network is widely used. That is speculation, but it sidetracked me, and it scared me in particular how they tolerated the economic perturbations and became multi-coin experts that trade in all the coins at the same time (and sense the changes in all markets).
    Post edited by harveybc0 on
  • harveybc0 · Posts: 29 · Member
    I doubt this code will do anything useful for anyone, even if someone manages to tune the plethora of parameters and run it; it is very inefficient, but it works. I did it in C because I thought I would be able to memory-tune the process for optimization. Big mistake. That's why I now use Encog.
    But it works for training a NEAT expert on 5 different machines with the same dataset (market data); I used the same technique described for Singularity mining. I mean, each of the machines (like miners) runs a totally independent NEAT training, but they share the best genomes when an increment of fitness is reached; the others check whether the published genomes have greater fitness than the population they are training and exchange the genomes if true. The funny name of the project comes from the technique that killed Bill in Kill Bill Vol. 2, a movie of which I was a die-hard fan. It includes some code in MQL4 used for interfacing with Forex. As I mentioned earlier, it is very ugly, unmanaged code, but it is at least commented (in Spanish).
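The sharing rule described above could be sketched roughly like this (illustrative only; `exchange_if_better` and the toy numeric "genomes" are my names, not from the actual C code):

```python
def exchange_if_better(local_population, published, fitness_of):
    """Each machine checks genomes published by its peers and adopts any
    that beat the worst member of its own population."""
    population = sorted(local_population, key=fitness_of)  # worst first
    for genome in published:
        if fitness_of(genome) > fitness_of(population[0]):
            population[0] = genome          # replace the current worst
            population.sort(key=fitness_of)
    return population

# Toy setup: a "genome" is just a number and its fitness is its value.
fitness_of = lambda g: g
machine_a = [1, 5, 3]
machine_b_published = [7, 2]
merged = exchange_if_better(machine_a, machine_b_published, fitness_of)
```

Here machine A adopts the published genome 7 (it beats A's worst, 1) and rejects 2, so the machines converge on the best genomes without any central coordinator.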
  • harveybc0 · Posts: 29 · Member
    I wonder if someone did this profitable activity in past years with worms or viruses, hehe. I am totally poor, now as then, so I am not guilty of anything. I say this because of the hate some people express towards the project, especially in chat rooms.
  • harveybc0 · Posts: 29 · Member
    For those who do not know the foreign exchange market, or Forex: it is a decentralized market of real currency; I think some brokers even support Bitcoin trading...

    http://en.wikipedia.org/wiki/Foreign_exchange_market

    I once heard, in an opinion from a Forex Automated Championship winner, that AI, if widely used, could reduce speculation in the markets.
  • harveybc0 · Posts: 29 · Member
    Apart from the Kolmogorov complexity of the experts, the population size (a NEAT parameter) used for training by each miner can be used as a measurement of the processing power required to neuroevolve an expert. The experts and the datasets are shared resources.
  • harveybc0 · Posts: 29 · Member
    Ethereum would be very, very complete if some protocol or interface were implemented to change the mining system (I do not know about the feasibility of that), and a LOT of applications/coins like Singularity could be made on it. I am forking a very simple CPU Bitcoin miner and client, and I will try to prove the whole concept starting there. All suggestions welcome. If someone wants to start a parallel job doing the same thing, I will gladly collaborate conceptually and by coding.
  • Jasper · Eindhoven, the Netherlands · Posts: 514 · Member ✭✭✭
    @harveybc0 there is only one mining function, and the entire body of Ethereum users depends on it. So to change it, there would have to be a mechanism for doing so. Not going to happen in Ethereum 1.0, I reckon.

    Secondly, mining is there for security. It also does issuance, to 1) manage the ether supply against hoarding and 2) incentivize the miners. That it does issuance essentially means that you are saying we should pay for this Singularity thing. Not saying this is necessarily wrong, just making it explicit.

    You could also have it not be issuance, and try to figure out how people might want to pay for the activity. If a contract can register the work being done securely, you could make it part of the business model of a contract (like a DAO).

    Note that if you would do it solely for (2), and we found it useful, it probably wouldn't be mining. It would just be in the form of a contract with 'a magic wallet'. Exceedingly unlikely this is going to happen in Ethereum 1.0.
  • harveybc0 · Posts: 29 · Member
    Thanks for the information, @Jasper. I am just starting to examine the CPU miner code; I think after that I will need to modify a client as well to implement the program. When I have something ready to show, I will publish it here.
  • Jasper · Eindhoven, the Netherlands · Posts: 514 · Member ✭✭✭
    Unless it is really a 'public service' kind of thing, where it is difficult to get people to pay for it, I would try to make a contract that can measure whether work has been done.

    Generally, earlier on I was also interested in having mining do some useful thing. But now I realize that the level of activity of those useful things is better determined by demand than by having them controlled by the mining reward (of course, this is unless it also happens to secure the network).
  • harveybc0 · Posts: 29 · Member
    edited May 2014
    I am realizing that @Jasper‌ is totally right, and this will be too complex to do as I was thinking before. A better approach is to use Ethereum to create a secure rating system (a coin as a description of the quality of the resource) based on usage of the experts, while the management of the trained experts, which requires protocol modifications, is done in a separate distributed network (SingularityNet, hehe) that produces resources used/shared by the clients in interfaces like Forex or Raspberry Pi, generating secure (Bitcoin-like) transactions in the Ethereum coin during usage of the SingularityNet client nodes, acting as an interface between both networks (the coin network and the resource-producing network). The communications security required to implement SingularityNet is different and may require a lot of work to adapt to the current Bitcoin; it would require a different coin system. So my miner nodes connect only to SingularityNet, while my client nodes connect to both networks. So I will continue the work on the Bitcoin miner and start the implementation of the resource-producing network.
    Maybe at the application level the Singularity coin miners may mine both coins at the same time (something like 1% of CPU for the Ethereum coin and 99% for SingularityNet), and that is the attractive part for the miners.
    Post edited by harveybc0 on
  • StephanTual · London, England · Posts: 1,283 · Member, Moderator
    @harveybc‌ consider using http://boinc.berkeley.edu/ as your grid computing platform, and then interface it with Ethereum for the issuance of the rewards. This is a topic I'm very interested in. Will be watching closely.
  • ranford · Posts: 25 · Member ✭✭
    @Stephan_Tual, @harveybc, that would be great, but then again there is already Gridcoin, which basically does that (though not on Ethereum, of course). I doubt it would make much sense to start an Ethereum-based alternative to Gridcoin unless it offered some substantial advantage or new take on the concept. Possibly the advantage would be allowing more specifically targeted use of BOINC for special-purpose currencies, rather than just one general grid currency as with Gridcoin. Overall, I also find this intensely interesting.
  • harveybc0 · Posts: 29 · Member
    Thanks for all the advice; I will check whether the platform can be adapted, or use anything existing.
  • harveybc0 · Posts: 29 · Member
    edited May 2014
    @ranford, @Stephan_Tual I would like to use Gridcoin mining or another BOINC mining system for the reputation/quality coin (the 1% of CPU at the application level mentioned before).
    But the basic approach for training the AI using miners, and for using them to evaluate the experts (as in the Internet of Things), is the design problem I am facing now.
    I think I have a solution: use a blockchain for each of my experts (infinitely scalable), and all the miners of that n-coin receive the respective payment when one of them finds a measurable efficiency increase, with each miner paid in proportion to the computational effort it invested since the last efficiency increment, in a very simple manner. Each blockchain thus traces the changes made to each expert as its efficiency increases, and the coin measures the expert's value in invested CPU (n-coin).
    I am working on the protocol for sub-coin generation and measurement; they do not need other transactions because they are only accumulative.
    The coin forged for each miner is representative of the value of the expert in the Gridcoin system or similar.
    That is the basic idea. There is a lot of design left to do, but the concept remains the same, and now we help BOINC, which is very good. Any suggestions? Thanks.
    Post edited by harveybc0 on
  • harveybc0 · Posts: 29 · Member
    edited May 2014
    The BOINC network is ideal for evaluating the pre-trained experts for the clients to use in Internet of Things/research/Forex, as it is a separate network, and an interface with this network for genome downloading (a shared resource) can be made. And Singularity n-miners (using ~99% CPU) only produce the n-coins, which have exchange fees to the Singularity coin (BOINC-mined at ~1% CPU) based on the accumulated value and the last efficiency/CPU-usage ratio for each n-coin.
    Post edited by harveybc0 on
  • harveybc0 · Posts: 29 · Member
    edited May 2014
    Another very good approach I am exploring now is to introduce a neuroevolution iteration directly before every hashing iteration during the normal mining process. That seems feasible, and would introduce an iteration (or more) of neuroevolution for every iteration performed by the miner in the mining process (like doing 50% Bitcoin / 50% Singularity). But this has many other aspects to manage, like the variable evaluation times per genome. Still, it may work. Which do you think is the best way: the n-blockchain, the miner hack, or both?
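A minimal sketch of that interleaving idea, assuming a simplified leading-zero-bits PoW and a placeholder `neuro_step` standing in for one real neuroevolution iteration:

```python
import hashlib
import itertools

def hash_attempt(block: bytes, nonce: int, difficulty_bits: int) -> bool:
    # Simplified PoW check: top difficulty_bits bits of the hash are zero.
    h = hashlib.sha256(block + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(h, "big") >> (256 - difficulty_bits) == 0

def neuro_step(state: float) -> float:
    # Placeholder for one neuroevolution iteration (e.g. evaluating one
    # genome); here it just counts how much useful work was interleaved.
    return state + 1.0

def interleaved_mine(block: bytes, difficulty_bits: int):
    """One neuroevolution step before every hash attempt (~50/50 split)."""
    training_state = 0.0
    for nonce in itertools.count():
        training_state = neuro_step(training_state)      # useful work
        if hash_attempt(block, nonce, difficulty_bits):  # security work
            return nonce, training_state

nonce, steps = interleaved_mine(b"block", 10)
```

By construction, one unit of training work is done per hash attempt, which is the 50/50 split; the caveat mentioned above still applies, since real genome evaluations take variable time while a hash attempt does not.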
    Post edited by harveybc0 on