The attendees at these developer conferences are often very gung-ho about the technologies they are working on. They are wrestling with technical problems, and - given time, effort and processing muscle - these problems inevitably yield to technical solutions. "Look at our cool thing!" Then their tech gets loosed on the real world...
Last year, Adobe demoed 'Voco' (it hasn't actually been released yet), which can 'listen' to a few samples of someone's voice; then you type in whatever you want, and that person's voice will 'say' it. The audience were all 'woooo', but I found it quite chilling. Horrifying, even.
Provided the solution is within the realm of physics, technical hurdles get cleared eventually. The social/political/commercial problems are much more intractable. That's where regulation comes in: an AI must identify itself; recipients must opt in; contracts entered into with an AI are not binding (on the human) until ratified in some way. That kind of thing. Because 'bad actors' will be looking at the last few years' conference proceedings from Apple, Google, Adobe, Facebook and Amazon, and salivating.