Share your product feedback!

Please tell us what we can do to make Lovable the best product for you.

Token Limits and Processing Capacity Per Account

Hello! I used Lovable for free for a few days and was really happy with how your system handled my large requests. I sometimes sent in 100 lines of text, and the neural network processed over 60 of them. I can imagine how many tokens were burned, and yet there were hardly any bugs. Much respect and gratitude to you for that. But today, in every request of 100 lines, only about 5–10 lines get processed at best.

Can you explain what is behind this reduction: is it related to account status? Do new accounts get some kind of welcome token bonus? Or is this a platform-wide change affecting all users? How many lines of text are processed on a paid plan? What are Lovable's plans for the future? Does Lovable have the economic sustainability to keep handling large requests, and does the token spending model work out? Is there a reason to expect more power and greater processing capacity per request in the future, or will the platform focus more on optimization instead?

User 25 days ago

2
⁉️

Questions

Give feedback when something isn't possible in the environment

I was trying to create a UPnP/DLNA audio player, and Lovable was able to identify my devices, but it kept finding ways not to display the content from the server, instead finding multiple ways to create mock content while claiming it was working on it, until I finally asked why it was avoiding doing the work. It then added to my library page: “No content found on UPnP servers. A backend service is required for actual UPnP communication. The UPnP/DLNA protocol requires direct network access, which browsers cannot do for security reasons. A real implementation would need: a backend service that can use SSDP for device discovery, SOAP communication for content browsing, and a DLNA protocol implementation for streaming. This could be implemented as a Node.js server, Electron app, or browser extension with appropriate permissions.” OK, this is logical. But I wouldn't have spent nearly so long chasing my tail if it had told me this up front.
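For context on what that backend piece involves, here is a minimal sketch of the SSDP discovery step, assuming a plain Node.js process with local network access. The M-SEARCH request and multicast address are standard SSDP; the device type searched for and the timeout are illustrative choices, and none of this is code Lovable generates:

```typescript
// Minimal SSDP discovery sketch (illustrative only).
// Sends an M-SEARCH to the standard SSDP multicast address and logs replies,
// which is the device-discovery step a browser cannot perform directly.
import dgram from "node:dgram";

const SSDP_ADDRESS = "239.255.255.250";
const SSDP_PORT = 1900;

const searchRequest = [
  "M-SEARCH * HTTP/1.1",
  `HOST: ${SSDP_ADDRESS}:${SSDP_PORT}`,
  'MAN: "ssdp:discover"',
  "MX: 2",
  "ST: urn:schemas-upnp-org:device:MediaServer:1", // look for DLNA media servers
  "",
  "",
].join("\r\n");

const socket = dgram.createSocket("udp4");

socket.on("message", (msg, rinfo) => {
  // Each responding device advertises a LOCATION URL for its device description,
  // which a backend would then fetch before doing SOAP content browsing.
  console.log(`Response from ${rinfo.address}:`);
  console.log(msg.toString());
});

socket.bind(() => {
  socket.send(searchRequest, SSDP_PORT, SSDP_ADDRESS, (err) => {
    if (err) console.error("M-SEARCH failed:", err);
  });
});

// Stop listening after a short window; MX above asks devices to reply within 2 s.
setTimeout(() => socket.close(), 3000);
```

A real player would still need the SOAP browsing and streaming layers on top of this, but even a small discovery probe like the above makes it obvious early on that the work has to happen outside the browser.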

Jason Perez 28 days ago

πŸ›

Bug Hunting

Should: Automatic Cache Clearing for Version Control Operations

Problem Statement
Currently, when switching branches in Git or reverting to previous versions in Lovable, the TypeScript and build caches are not automatically cleared. This leads to:
- Stale type definitions causing false TypeScript errors
- Build artifacts from different versions conflicting
- Developer time wasted manually clearing caches
- Confusing error messages that don't match the current codebase

Examples of Issues
- TypeScript errors about missing properties (e.g., 'is_overflow') that should exist
- Build cache conflicts between different versions of the same component
- Persistent type errors even after reverting to known-working versions

Proposed Solution
Implement automatic cache clearing triggers for:
- Git branch switching operations
- Version revert operations in Lovable
- Code restoration from previous edits

Technical Implementation Details
The system should:
- Clear the TypeScript compilation cache
- Remove build artifacts
- Clear Vite's build cache
- Regenerate type definitions
- Trigger a fresh build

Key Folders/Files to Clear
- .tsbuildinfo files
- node_modules/.vite
- node_modules/.cache
- Temporary build directories
(A rough manual version of these steps is sketched after this post.)

Success Criteria
- No TypeScript errors when switching branches
- Clean builds after version changes
- Automatic type regeneration
- No manual cache clearing needed

Additional Benefits
- Improved developer experience
- Reduced debugging time
- More reliable builds
- Consistent type checking

Priority
Blocking - this is blocking my development.

Related Issues
- TypeScript errors when switching branches
- Build inconsistencies after version changes
- Manual cache clearing requirements
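Until something like this is built in, here is a minimal sketch of a manual clean-up script covering the folders listed above. The file name, the dist output directory, and the idea of calling it from a git post-checkout hook are assumptions about a typical Vite + TypeScript project, not anything Lovable ships:

```typescript
// clean-caches.ts - illustrative sketch of the manual workaround described above.
// Removes stale TypeScript and Vite caches so the next build and type-check
// start from a clean slate. Run with: npx tsx clean-caches.ts
// (or call it from a .git/hooks/post-checkout hook after switching branches).
import { rm, readdir } from "node:fs/promises";

const CACHE_DIRS = [
  "node_modules/.vite",  // Vite's dependency and transform cache
  "node_modules/.cache", // misc tool caches
  "dist",                // build artifacts (adjust if your output dir differs)
];

async function removeTsBuildInfo(dir = "."): Promise<void> {
  // Incremental compile state lives in *.tsbuildinfo files, by default at the project root.
  for (const entry of await readdir(dir)) {
    if (entry.endsWith(".tsbuildinfo")) {
      await rm(`${dir}/${entry}`, { force: true });
      console.log(`removed ${entry}`);
    }
  }
}

async function main(): Promise<void> {
  for (const dir of CACHE_DIRS) {
    await rm(dir, { recursive: true, force: true });
    console.log(`cleared ${dir}`);
  }
  await removeTsBuildInfo();
  console.log("caches cleared; rerun your dev server or build");
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

A post-checkout hook that simply runs this script would cover the branch-switch case; reverts performed inside Lovable would still need the automatic trigger requested above.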

André Brunetta 28 days ago

1
πŸ›

Bug Hunting