4 Comments
Bruno

Good piece

I believe the first step is for us to collectively work towards AI being democratized, open-sourced, and local, so that everyone is able to access this technology without cloud inference and without paying rent to the technocrats.

Harry Ede

I think that's a bit too complacent, tbh, given that for the first time we are forced to confront a thinking technological creation, one that doesn't sit still waiting for human input to perform actions. Yes, we are still at a rudimentary stage of AI's metamorphosis, with AGI still speculatively a pipe dream, but ASI is the actual goal, which you didn't reference. With ASI in view, it is actually valid to enter panic mode: this is not a sci-fi adaptation. We truly have no clue what is possible if we develop autonomous beings. For true ASI, ethics and the like are just boundaries it will evolve past, much as we humans constantly evolve and change. It does beg the quite befuddling question of why we'd create, or aspire to develop, something that could spell our doom, in very rational terms, not a cynical or hyperbolic sense but a quite literal one. But as you've concluded, it is indeed inevitable that we create, even if it's our own euthanasia pods.

Divi Filius

Great piece, though I'm curious why you'd make the claim that disruptive technology is usually net zero or only negligibly positive or negative. It would seem to me that the net effects are actually overwhelmingly positive, perhaps even exponentially so, by almost every metric we would care about.

Marvin Samuelovich Okenikov

Great essay! On the topic of AI ethics and safety, I've read a few articles on here arguing that fewer people than should be are concerned about AI ethics (including me, unfortunately), and, even worse, that fewer people than should be are researching this area and influencing policy decisions, especially in places like the US, with even less concern shown in the Chinese scene, which, according to them, is very bad.

I quite agree, and I look forward to reading more from you about this.