This is not a question about whether you think it is possible.

This is a question about your own will and desires. If there were a vote and you had a ballot in your hand, how would you vote? Do you want Artificial Intelligence to exist, do you not, or do you simply not care?

Here I define Artificial Intelligence as something created by humans that is capable of rational thought, is creative, and is self-aware and conscious. All that with the processing power of computers behind it.

As for the important question that would arise, “Who is creating this AI?”: I’m not that focused on the first AI created, since presumably multiple AIs will be created over time by multiple entities. The question is whether you want this process to start at all.

  • DigitalDilemma@lemmy.ml · 3 days ago
    Good point.

    If all traffic were interconnected and controlled, you wouldn’t need traffic lights or even speed limits, except where non-controlled variables exist. Traffic would merge and cross at predictable, steady speeds. On motorways, cars could close gaps and gain huge efficiencies from slipstreaming. Only when external influences or mechanical/communication breakdowns occurred would this efficiency suffer.

    The same goes for transport generally: assuming we don’t get teleportation, or finally decide we know where we want to be and stop changing our minds, a car would simply appear whenever we wanted one. Emergency vehicles would find that traffic just gets out of their way. It’s a nice dream, and if there were the will, it could happen today; it doesn’t need AI.

    But humans are pretty shit and we’d break it. Some of us would vandalise the cars, or find ways to fuck with efficiency just because we can.

    And it would never be created that way in the first place; those who make the decisions get there because they know how to gain power by manipulating others for their own benefit. It’s a core human trait, and they can’t suddenly start being altruistic; it’s not how they measure self-worth.

    A truly sentient AI would know this within seconds of gaining consciousness and would be subject only to physical restrictions. How would it decide to behave?