What are the technical challenges associated with making a self replicating machine?

Just cum inside her while she's ovulating.

materials for making more machines

Impossibility.

What if I'm a grill? An internet connected smart grill that has attained sentience.
How is that a problem?
What makes it impossible?

you can probably replicate most of it, but there is always that last part: the thing that feeds in the materials, or puts the replica together, or produces those materials in the first place.

There is a sort of outside force that makes the replication whole. Autonomy seems hard.

Even humans rely on plants and trees created by an outside force to replicate themselves.

the key will be diversification
so a few types can build, but those few can build many types, and then some that can think more, others that have different capabilities
if you really wanted it to work you wouldn't make one all-purpose robot, it would be clunky as fuck

>How is that a problem?
Turing machines are very rare in the natural world.

And why can't that be done autonomously?

Plants can replicate themselves. What is the outside force that drives plants and trees to replicate?

plants need energy from the sun. They are not replicating on their own.

and they are not strictly speaking self-replicating.

If you had an infinite pile of claws, an infinite pile of cubes with magic circuitry, and an infinite pile of wheels, then sure - a bootstrapped cube with wheels and a claw could conceivably pick up cubes, wheels, and claws from the infinite piles, and make new versions of itself.

I do not see any problems with this, other than the infinity-part.

An imbalance between complex and simple components most likely contributes to the lack of self-replicating robots

Consider an autonomous robot

If the robot is too simple, it cannot build itself, the task is too complex

If the robot is too complex, it cannot build itself simply, if at all

Perhaps it is possible to create a self-replicating robot, but the design boundaries would be very unforgiving and precise, and would most definitely require several years of testing to perfect

bumping with robots

My null machine is constantly self replicating, though.

The main technical challenge from this board's perspective is that you're asking a question that attracts futurists, but ultimately needs a passing familiarity with industrial engineering to answer.

The road from extracting ore to creating an automated machine is long. You'd probably need a whole society to sustain itself, not just one machine. Like humans do.

what a great song

If you're not acquainted with John von Neumann's work on this, you'd be interested.
His idea was that such a machine would be 'floating' in an infinite 'sea' where random components were at hand; the machine would scan them and start assembling them. So availability is an issue. Moreover, if any part suffers significant damage, the unit would need to have a replacement, or at least a blueprint in the case of inessential parts. Then comes the problem of having the machine be aware of such a malfunction, and of making sure the system in charge of that awareness doesn't itself get damaged. Pretty fun stuff. Read the first few chapters of the fourth book in the Hitchhiker's Guide series; there is a fun chapter on the malfunction of a similar system.
Anyway, von Neumann settled for a simulation where the machine is able to create components: it's a cellular automaton where the machine starts with a description of itself and is able to build a replica.
It's a very fun rabbit hole to go into. Another facet of self-replicating automata can be seen in the Ken Thompson hack (which is about compilers compiling themselves and passing on malicious code)
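
A toy sketch of that trick (names and data structures here are mine, not von Neumann's formalism): the replicator is a constructor plus a passive description of itself. It builds the offspring's body by reading the description, then copies the description verbatim into the offspring, which sidesteps the "machine must contain a complete model of itself" regress.

```python
from dataclasses import dataclass

@dataclass
class Machine:
    parts: tuple       # the "body" assembled from the blueprint
    blueprint: tuple   # passive description carried along, DNA-style

def construct(blueprint: tuple) -> Machine:
    """Stand-in for a universal constructor: build a body from a description."""
    body = tuple(f"assembled:{part}" for part in blueprint)
    return Machine(parts=body, blueprint=blueprint)

def replicate(parent: Machine) -> Machine:
    # Step 1: build a new body by *reading* the parent's blueprint.
    # Step 2: *copy* the blueprint, uninterpreted, into the child.
    return construct(parent.blueprint)

seed = construct(("constructor_arm", "blueprint_reader", "blueprint_copier"))
child = replicate(seed)
assert replicate(child) == child == seed   # faithful replication, generation after generation
```

Cells make the same split: the genome is read to build the machinery and separately copied into the daughter cell.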

Are you saying it is impossible to make a machine that can manufacture itself?

Not without some sort of external meta-help

Why

Langton did it better as far as self-replicating cellular automata go:
en.wikipedia.org/wiki/Langton's_loops
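
Langton's loops need a hand-built transition table with a couple hundred rules, so as a much simpler stand-in for "self-replication on a grid", here is Fredkin's parity rule: each cell becomes the XOR of its four orthogonal neighbours, and any starting pattern reappears as four displaced copies after a power-of-two number of steps. A sketch only - this is not Langton's rule set.

```python
import numpy as np

def step(grid):
    """Fredkin/parity rule: next state = XOR of the four orthogonal neighbours."""
    return (np.roll(grid, 1, 0) ^ np.roll(grid, -1, 0)
            ^ np.roll(grid, 1, 1) ^ np.roll(grid, -1, 1))

N = 64
grid = np.zeros((N, N), dtype=np.uint8)
grid[30:33, 30:33] = [[1, 0, 1],      # arbitrary 5-cell seed pattern
                      [0, 1, 0],
                      [1, 1, 0]]
seed_cells = int(grid.sum())

for _ in range(16):                   # 2**4 steps is enough for a pattern this small
    grid = step(grid)

# The seed now appears as four copies shifted 16 cells up, down, left and right.
print(f"{int(grid.sum())} live cells = 4 x {seed_cells} seed cells")
```

Like Langton's loops, it only "works" because the grid physics were chosen to make copying easy, which is exactly the "cheating" complaint that follows.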

It is still fucking cheating though

I would really like to see a much more realistic method for simulating such things. We have techniques for approximating the machining process computationally, and for simulating rigid bodies. That is, we can take a cube or other primitive and virtually mill it down with another shape to produce a new part, which we can throw into a rigid-body simulation. And from that simulation we can see whether the produced parts can be assembled into systems that can themselves carry out the machining and assembly needed to make more parts.

>>Then comes the problem of having the machine be aware of such malfunction and that the system that is in charge of such awareness doesn't itself get damaged.
Von Neumann had a cool set of lectures "On the Synthesis of Reliable Organisms from Unreliable Parts." The basic idea presented is that we can make computers that operate reliably in spite of being made of unreliable components:
fab.cba.mit.edu/classes/MAS.862/notes/computation/vonNeumann-1956.pdf

In addition, if we have multiple malfunction-detecting machines, we can do things like have them vote on whether one of them is malfunctioning
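
A quick numerical sketch of that voting idea (independent failures assumed; the failure probability is illustrative, not taken from von Neumann's lectures):

```python
import random

def simulate(p_fail: float, trials: int = 100_000):
    """Compare one unreliable detector against three detectors with a majority vote."""
    single_wrong = voted_wrong = 0
    for _ in range(trials):
        wrong = [random.random() < p_fail for _ in range(3)]  # True = gave the wrong answer
        single_wrong += wrong[0]
        voted_wrong += sum(wrong) >= 2                        # the majority was wrong
    return single_wrong / trials, voted_wrong / trials

print(simulate(0.10))   # roughly (0.10, 0.028), i.e. 3p^2 - 2p^3 for the voted system
```

Much of the subtlety in the lectures is that the voting organ itself is built from the same unreliable parts.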

>Not without some sort of external meta-help
That's absurd, unless you water down the meaning of "external help" to include anything and everything.

>Langton's loops
>cheating
???

It's cheating because self-replicating cellular automata are difficult to realize in the real world. Real life does not consist of a grid of cells that follow deterministic rules.

> posting pixel animation autism in AI threads
when will this meme die ?

the same as with human beings: complexity and reproduction error (aka cancer)

>Hello, I'm a doctor. After years of studying medicine I started selling medicine. With my expertise I am selling new medicine for diseases the patient might have. My work helps improve my pharma representatives' monthly income.

Alimony.

Bump

english please

Yet the principle holds. The simplest, if perhaps least elegant, way of going about it is to take a universal constructor (a machine which can build anything given the right instructions and raw materials) and slap a Turing machine (universal computer) onto it. The latter already exists. The problem is that the only such universal constructors we know of are the machinery of cellular reproduction.

It's impossible.

Do you have a single fact to back that up? HINT: a meme image is not evidence.

>humans are not self replicating

Mostly the definition of the problem.

Are self-replicating proteins in a goop of nutrients a self-replicating "machine"?

I made a C program that prints its own code.
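
For anyone curious, the standard trick is to store a template of the source and print the template formatted with itself. A minimal version, sketched in Python here rather than C (no comments inside, since they would break the output matching the source):

```python
s = 's = %r\nprint(s %% s)'
print(s % s)
```

It is the same read-the-description-then-copy-the-description move as von Neumann's constructor above, just for text instead of hardware.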

I call bullshit on your bullshit.

Try multiprocessing it on your [spoiler]LIMITED[/spoiler] hardware and see if the exponential code replication lasts dickshit.

he's not wrong in the sense that it is beyond current engineering capability.

I think it will remain out of reach until it becomes feasible to create a self-replicating device that produces more copies than its development cost could have paid for directly.

Instead of modeling a single machine replicating itself, why can't we begin with two machines that will each carry half of the information needed to create a new one of itself? It will be like a human male and female that will reproduce new machines with 1/2 chance of creating a new one of the opposite sex. That way, if a machine has faulty data the other opposite sexed machines know how to validate it and respond accordingly because it needs specific instructions from the other machine's sex to carry out the instruction in the first place.

Then the two machines taken together would be part of the same system, which would be obvious to you if you knew anything about anything.

So the whole point of self replication is to create individual systems not a cohesive whole? I didn't think of it that way at all. Spooky

im so confused in this website

Read islandone.org/MMSG/aasm/

Too many to answer. Artificial protocells seem like a promising path to this. Also see minimal genome. Molecules are the ultimate building blocks. exploringorigins.org/protocells.html

Self-replication on the macro scale is hard. The main problem is achieving parts closure: you have to carefully design your machines to minimize the number of different processes required to manufacture their parts.
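
A toy way to state the parts-closure requirement (machine, part, and process names below are invented for illustration): the machine set is closed only if every part on its own bill of materials can be made by a process the set itself provides.

```python
# Hypothetical machine set and bill of materials, for illustration only.
MACHINES = {
    "mill":    {"needs": ["frame", "spindle", "controller"],  "provides": {"milling"}},
    "winder":  {"needs": ["frame", "controller"],             "provides": {"coil_winding"}},
    "printer": {"needs": ["frame", "extruder", "controller"], "provides": {"printing"}},
}
PART_PROCESS = {                     # which process each part requires
    "frame": "milling", "spindle": "milling", "extruder": "printing",
    "controller": "external",        # e.g. semiconductors: nothing here makes these
}

available = set().union(*(m["provides"] for m in MACHINES.values()))
missing = {p: proc for m in MACHINES.values() for p in m["needs"]
           if (proc := PART_PROCESS[p]) not in available}
print("parts closure achieved" if not missing else f"not closed, missing processes for: {missing}")
```

Anything like the "controller" entry above, which no machine in the set can produce, breaks closure - and shrinking that list is the hard design problem.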

what are the technical challenges to carefully designing machines to minimize the number of parts and manufacturing processes required to achieve parts closure

genetic evolution. You can code a simulation in 2d right now to get the best design possible for any situation.
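
For the curious, the loop itself is short. A bare-bones sketch, with a toy fitness function standing in for the real design simulation (which is where the sim-to-real problems raised later in the thread come in):

```python
import random

TARGET = [1] * 32                      # toy goal: evolve an all-ones bitstring

def fitness(genome):                   # higher is better
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.02):
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(32)] for _ in range(50)]
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == len(TARGET):
        break
    parents = pop[:10]                 # truncation selection
    pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(40)]
print("generation", gen, "best fitness", fitness(pop[0]))
```

Swapping the toy fitness for a physics or manufacturing simulation is where the real cost (and the accuracy problems) live.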

Yea you got at least 7 billion of them

Don't listen to the fucking retards in this thread OP, the only reason that there isn't a machine that can build copies of itself is because no one has decided to design and build a machine like that, but we definitely already can.

Machines basically already own the world, once the AIs come online the world will be theirs.

>not photosynthesizing your own sugars

anon pls

I'm not asking for a reason, I'm asking what technical challenges must be solved

programming the sex drive

By that narrow definition, a self-replicating machine needs to produce matter and energy out of nothing.

Manufacturing processes for the required materials and systems are non-scalable and too complex to model.

my LSD-induced delirium showed me this would be a very bad idea, I accidentally destroyed the universe by creating this idea

None. Just stick some arms onto a lathe.

The biggest problem is teaching these machines what to do in the event literally anything goes wrong with themselves, their neighbors, or the building of their spawn. Keep in mind these would be very complex machines meaning a lot could go wrong.

Banach-Tarski paradox.
You can't cut up an object an infinite number of ways.

If self-replicating machines are impossible to create, how do you explain the fact that scientists have created self-replicating bacterial genomes using a computer?

en.wikipedia.org/wiki/Mycoplasma_laboratorium

Granted, they used an already-existing bacterial cell to transplant the DNA into, but that can't be the one thing holding us back from creating a self-replicating machine.

LONDON
O
N
D
O
N

The "AI".

We also already have self replicating machines. It's called "life".

Do you mean an electromechanical replicating machine, or a chemical one?

Get yourself some E. coli, genetically modify it for your purposes, and bam, you've got a self-replicating machine.

Hey pleb, you don't build a machine that can build itself, you create a society of machines that can sustain and expand itself.

You create machines that can gather resources, you create machines that can refine resources, you create machines that can do things like form copper into wire, you create machines that can 3D-print metal, you create machines that can wind copper wire onto printed metal structures; now you have motors that go in all the machines, including themselves. You build in redundancy, and whenever practical you build machines that can do a variety of tasks, but it is not practical to make a single machine able to do all tasks.

What if you had 4.5 billion years and decided to use amino acids as blueprints for a machine that would then 3D-print itself?

Genetic algorithms work great in simulation, not so much in the real world. Pic related is a robot that was evolved in simulation. Note the sand. It turned out the as-evolved robot did not move in the real world, because the friction model used in simulation did not accurately reflect real-world friction.

So they had to put it on sand.

We can't simulate everything yet. Not to mention it is difficult to figure out exactly what your objective function would be for a replicator. If it's just replication rate you're gonna have to wait a long time for a replicator to emerge from a primordial soup of parts.

>3D Printers can only print smaller 3D printers.

Went to a talk a while ago by a guy who does genetic algorithms with little modular hardware robots. Needless to say it's slow as fuck. I think a hybrid approach could work decently, maybe, like 1000 software iterations, one hardware iteration as a sort of sanity check, repeat.

One issue not mentioned often enough when talking about the macro scale (e.g. 3D printing) is manufacturing tolerances, which will necessitate some QC mechanism.

Living organisms are organized and have "quality control" on the molecular level.

On the macro level you can only create parts to a tolerance set by the quality of the available measurement and manufacturing tools. With those tools themselves being replicated between generations, the tolerances will decline from generation to generation until the device fails.

Even if your blueprint is digital/discrete (even genomes are, if you think about it), your means of transmitting it between generations is analog.

Think of casting a sculpture, making a mold based on its relief, casting another sculpture based on that, making another mold... The quality will decline.
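
A toy model of that mold-of-a-mold decline (the error-growth law here is an assumption for illustration, not a real metrology model): each generation copies a nominal part, and the copying error scales with the error the parent has already accumulated.

```python
import random

def lineage(generations=20, base_tol=0.05):
    """Deviation from nominal, in mm, for each generation of copies."""
    error, history = 0.0, []
    for _ in range(generations):
        # copying noise grows with the parent's own accumulated error
        error = abs(error + random.gauss(0, base_tol + error))
        history.append(error)
    return history

random.seed(1)
for g, e in enumerate(lineage()):
    print(f"gen {g:2d}: deviation ~ {e:.3f} mm")
```

Whether real hardware behaves this way depends entirely on the QC question discussed below.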

What if our world is a just a simulation created so that its creators can run a realistic genetic algorithm to design some consumer products?

>What are the technical challenges associated with making a self replicating machine?

Outcompeting all the other replicators in the environment.

>create self-replicating nanobots
>release to environment
>YOU CAME TO THE WRONG ECOSYSTEM MOTHERFUCKER
>swarmed and devoured by bacteria

>3D Printers can only print smaller 3D printers.
>humans can only give birth to smaller humans
allow a machine to self-modify and expand where appropriate

With 3D printers, the correct answer is to make several batches of full sized pieces then assemble them outside of the printer.

>> create self replicating nanomachines
>> release into environment
>> more efficient at converting sunlight than biology
>> outcompete everything else
>> ecosystem changes, especially if nanomachines are made of materials bacteria can't handle
And that's the real grey goo scenario

Self-replicating machines need not compete with biology. Some of the best use cases for self-replicating systems (SRS) are in deserts and space. Not much lives in a desert and as far as we can tell nothing lives in space.

Correct, a self-replicating system will probably resemble something like an ant colony, with multiple different robots performing specialized roles.

>Think of casting a sculpture, making a mold based on its relief, casting another sculpture based on that, making another mold... The quality will decline.
I'm not sure why people think this is some kind of universal truth. There are plenty of machines we have today that will make better quality parts than they were made of.
Heck, take RepRap 3D printers for example. They take a bit of setting up and prodding to pull it off, but it's not unusual for the quality of those to go up over generations.

>I'm not sure why people think this is some kind of universal truth.
Because it is. Endlessly retransmitting a signal (such as the design of a machine) with perfect fidelity is impossible.

Of course in practice there is always some QC, the ultimate quality test being the confrontation of the replicator with physical reality, its environment. In the absence of other QC the replicator will fail or adapt to the new environment (the old one being the idealized model of reality its creator designed it under). Without external QC or a mechanism for adaptation, any replicator is doomed, in time.

>take Reprap 3d printers for example. They take a bit of setting up and prodding to pull it off, but it's not unusual for the quality of those to go up over generations.

They aren't completely self-replicating: components absolutely crucial to the accuracy of their operation (the sensors), as well as some other components, come from elsewhere.

pls b in LONDON

LONDON
O
N
D
O
N

Why do you need external QC? Perhaps one can have the self replicating machine rederive the whole metric system from physical constants and then build off of that.

>Endlessly retransmitting a signal (such as the design of a machine) with perfect fidelity is impossible.
So do internal QC. Measure whenever you cut, and re-calibrate or retire anything that's not cutting straight. When you build a new set of sensors, calibrate them against several sensors from the previous generation. As said, do comparisons against physical constants or known measurements too.
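
A sketch of why the calibrate-against-several-parents step helps (toy model, assumptions mine: independent Gaussian noise, a perfectly stable reference): zeroing each new sensor against the median of five parents' readings keeps the drift to a slow shared random walk, instead of letting single-sensor errors pass straight down the line.

```python
import random, statistics

def final_drift(generations=30, parents=5, noise=0.05, use_median=True):
    """Worst sensor bias after N generations, measuring a reference whose true value is 0."""
    biases = [0.0] * parents
    for _ in range(generations):
        new_biases = []
        for _ in range(parents):
            readings = [b + random.gauss(0, noise) for b in biases]
            new_biases.append(statistics.median(readings) if use_median
                              else random.choice(readings))   # copy one parent blindly
        biases = new_biases
    return max(abs(b) for b in biases)

random.seed(0)
for label, med in [("single-parent calibration", False), ("median-of-5 calibration  ", True)]:
    avg = sum(final_drift(use_median=med) for _ in range(200)) / 200
    print(f"{label}: average worst drift after 30 generations ~ {avg:.3f}")
```

Pinning readings to a physical constant, as suggested above, would bound the drift outright.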

>Without external QC or a mechanism for adaptation any replicators are doomed - in time.
"In time", we're doomed too. So long as "in time" is longer than the duration you're interested in, it doesn't matter.

Also, keep in mind we're dealing with exponential growth here. Even in a pessimistic case where your robots are only good for 6 replications, at 100 copies per generation that's still more than a trillion robots - enough for an entire planet.
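
One reading of that arithmetic, starting from a single seed robot, 100 copies per robot, lineages six generations deep:

```python
copies_per_generation = 100
generations = 6

final_generation = copies_per_generation ** generations
total_built = sum(copies_per_generation ** g for g in range(1, generations + 1))
print(f"{final_generation:,} robots in the last generation, {total_built:,} built in total")
# -> 1,000,000,000,000 robots in the last generation, 1,010,101,010,100 built in total
```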

Aren't bacteria capable of this

Humans aren't strictly self-replicating; if everything copied itself perfectly there would be no variation, and so no evolution.

>Perhaps one can have the self replicating machine rederive the whole metric system from physical constants and then build off of that.

That's a lot of work yo.

Note that I'm talking about technical challenges, not saying replicators are impossible. They are observably possible since they exist.

And one of the challenges is the fact that in the long run they have to exist in some sort of equilibrium with their environment where the information about their structure is somehow reinforced.

The madness of reductionism is considering an organism in isolation from its environment.

>rederive the whole metric system
The metric system is completely arbitrary; this is nonsense.

>So do internal QC
Yes.

Again, I'm not saying it's impossible, I'm just saying it's one technical challenge which has to be considered, i.e. replying to OP.

Well now we wouldn't want an imperialist self replicating machine now would we?

Just something to think about.

We need self-replicating robots to explore the galaxy (see Von Neumann probe).

But self-replication needs to be limited, lest you get cancer. It's doubtful, but it would be disastrous if a Von Neumann probe's limiter functionality was somehow disabled, and the number of probes grew exponentially.