
Community News
Cursor told me I should learn coding instead of asking it to generate it + limit of 800 locs
Hi all, yesterday I installed Cursor and I'm currently on the Pro trial. After coding a bit I found out that it can't go beyond 750-800 lines of code, and when I asked why, I got this message:

Not sure if LLMs know what they are for (lol), but that matters less than the fact that I can't get past 800 LOC. Has anyone had a similar issue? It's really limiting at this point, and I got here after just 1 hour of vibe coding. My operating system is macOS Sequoia 15.3.1.
Read in full here:
Most Liked

Eiji
I wonder what people using chatbots to generate code think they can do with it. For example, what license should apply to it?

- Chatbots are public and available to everyone, which means the same code can be generated for anyone else. If that's the case, we can't claim copyright over it, any more than over other public data that isn't ours.
- Chatbots don't think: they generate output based not only on the input, but also on the data they were trained on. What license does that training data carry? Who knows. It wouldn't be anything new to find copyright problems in closed-source software, right? Imagine somebody proving that. And it's not just "oh, when they change the license for the chatbot I'll stop using it" - it's about code that has already been generated.
- Chatbots are owned by corporations, and for now they are more like beta apps. What happens once chatbots can generate production-ready code? The corporations will change the terms of use and the license. Think twice about why anyone would do all the business work for free.

For sure, everyone using chatbots should understand the code they are generating, but even if that weren't a problem there are plenty of others. When you hire a developer to create some software, it's their code and their responsibility. If you generate the code, it's up to you to prove that it belongs to you. Keeping in mind how much legal systems differ and how often they change, this looks like an endless topic full of "what if ..." and without any facts.

TimButterfield
I'm not really concerned with this, at least not yet. Even with relatively simple prompts, you can get very different results from the same AI model, because individual responses are not consistent and can vary over time. There is an API parameter called 'temperature' that can be used to control this somewhat, though it is not directly controllable through most chat UIs. Further, the more the AI has to guess, the more variable it becomes. I wrote a blog post about AI guessing and how to reduce it. You have to be very detailed and specific to get it to produce the code the way you want it, rather than however it guesses.

As for the licensing of my own apps, the providers may have to fight over it, as I use more than one AI provider to generate different things. They would need to prove which parts of the final result were generated by them and not by me or by another AI provider. I don't really see that happening.

The bigger issue with having AI generate complete apps goes back to the guessing I mentioned before. How can it know what your goals are for the app? There are many apps that are just slight variations on other, mostly similar apps. It cannot know which variant of anything you want unless you specifically tell it.
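For readers unfamiliar with the 'temperature' parameter Tim mentions: most LLM APIs expose it directly, even though chat UIs usually don't. Here is a minimal sketch using the OpenAI Python client (the model name and prompt are illustrative, not taken from the thread):

```python
# Minimal sketch: lower temperature -> less variable completions.
# Assumes the openai package (v1+) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Write a function that parses ISO 8601 dates."}],
    temperature=0.2,      # near 0: more repeatable; higher values: more varied output
)
print(response.choices[0].message.content)
```

Even at temperature 0 the output is not guaranteed to be byte-identical from run to run, which is part of why Tim doubts anyone could prove which provider generated what.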

chris.johan
Don't LLMs have a limit on the number of output tokens they can produce?
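They do. Providers cap how many tokens a single response can contain, and many APIs also let callers set a lower cap themselves, which is one plausible explanation for generation stopping around a fixed number of lines. A hedged sketch of that cap (parameter names vary by provider; `max_tokens` is the OpenAI chat-completions name):

```python
# Illustrative: max_tokens caps the length of a single completion, so a long
# file can simply be cut off mid-output. Assumes the openai package and an API key.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Generate a 1000-line module."}],
    max_tokens=4096,  # the provider's own per-model ceiling still applies
)
print(response.choices[0].finish_reason)  # "length" means the cap was hit, not a natural stop
```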