Absorbed you? [ Nothing Levi is saying is making this any clearer. Or less worrisome. ] You think they sent you here, or you were plucked here?
Sort of, but not quite as instantaneous as a golem would suggest. It's based on a dataset - so we take a bunch of information and we feed it into the computer program. We teach it using the dataset - like, this is what happy looks like, this is what sad looks like, etc., because it doesn't know anything until you tell it what it is. [ A completely blank slate. Jim could understand how that could be confusing. ] Eventually, it starts to build its own connections and think for itself. But it's all based on what data you give it at the start - if you teach an AI "this is the enemy" and then - I don't know - feed it a picture of a dog, it will learn to hate dogs. Make sense?
Yeah, until an AI is sufficiently advanced, it's controlled by the dataset and whoever programmed it. Stopping it could still be as simple as unplugging the machine.