
AstonJ
How useful do you think AI tools are/will be?
AI has been a hot topic here on Devtalk recently, so along that theme: How useful do you think AI dev tools are right now and how useful do you think they might become?
There are multiple polls below; please vote in each one.
How useful do you think AI tools are to developers right now?
- Not useful
- Somewhat useful
- Moderately useful
- Very useful
- So useful that I consider them a necessity
How useful do you think AI tools will be 5 years from now?
- Not useful
- Somewhat useful
- Moderately useful
- Very useful
- So useful that they would be considered a necessity
How useful do you think AI tools will be 10 years from now?
- Not useful
- Somewhat useful
- Moderately useful
- Very useful
- So useful that they would be considered a necessity
How useful do you think AI tools will be 15 years from now?
- Not useful
- Somewhat useful
- Moderately useful
- Very useful
- So useful that they would be considered a necessity
How useful do you think AI tools will be 20+ years from now?
- Not useful
- Somewhat useful
- Moderately useful
- Very useful
- So useful that they would be considered a necessity
Most Liked

mindriot
10-20 years out is a long, long guess. Small pool of results so far, but I was a bit surprised that, despite being pretty skeptical about the claims, I tended to rate them more useful in the near term and less useful in the long term. I have pretty deep concerns about the ability of these things to learn or keep up with improvements, since the current level of usefulness is already resulting in a deluge of slop becoming the learning pool for future training of statistical models. The long term feels to me like it needs another paradigm breakthrough on the order of the current LLM hype just to keep these things as useful as they were before people started using them.

mindriot
Well, I'm not saying anything for sure about the future. What I am saying is that the breakthrough that just happened was a matter of consuming all the content on earth to train on (with questionable respect for who owns said content). So now we have statistical models that output the average of what was available (or, in other words, the bottom 20% in terms of quality, if you're lucky). Then everyone uses said tools to massively increase the quantity of content at the bottom end of quality. So we are dramatically shrinking the statistical impact of new valuable information to train on relative to the junk being emitted into the digital world, and we have already consumed what it took to get here.
Let's just say we keep going and it does get better. Say tomorrow there is a new security issue, something deep and fundamental, an "everything we do is a SQL injection attack" level crisis on every request. Who is going to make this a statistically significant data point that actually influences the training? Not anyone using AI tools, right? (On this point, Copilot has flagged several hundred PRs I've created in work repos because it doesn't know about new features in languages or frameworks, and it has only been a reviewer on those repos since early this year. To be fair, it has also made a few reasonable suggestions, but those don't relate to this specific point about it saying random outdated nonsense that is objectively false.)
Further, how does any new technology or approach become part of the solution? We end up in a world where everyone uses a tool based on the current situation to create the next situation, and how that tool sees the world dominates what it can learn from, to the exclusion of other inputs, unless someone intervenes.
Which brings me to the next point: what happens when these companies no longer want to lose millions per day setting fire to rainforests to power this hugely inefficient monster of mediocrity? The obvious move is to sell training bias, influencing what people use as a form of advertising.
On the other hand, let's pretend it doesn't play out like this, and it really is some exponential curve of god-like power that takes over all need to do any work. Who is going to buy this? Who are their customers? When you create a situation where some 60-70% of the workforce becomes jobless in a five-year window, like the hype sells, who are you selling your product to? Best case is the collapse of capitalism, but it could approach an extinction-level event if it plays out the way Hacker News and Twitter seem to sell it to CEOs.
Whatever way, I see the current bump as a short-term thing. Either it balances out or we are all living Mad Max style in a few years. No one really wins, because neither the idea nor the advantage is sustainable. (But I'm not saying anything about the future; every day it seems like some significant section of humanity is doing something more stupid than I thought possible... maybe for the second time.)

joeb
Thanks. I am probably an old head in the sense that I still want to do most of my coding and thinking myself, and not just depend on an AI assistant to do most of my work.