AI might not just kill coding... it could swallow software itself. With Claude Sonnet 4.5 you describe what you want, and software appears in real time, all inside what looks like a simple chatbot, where conversation replaces code.

At first this means an explosion of apps, systems, and tools: anyone can build anything instantly. But when software is this abundant, the meaning of “apps” begins to disappear. Execution stops being the bottleneck. The scarce resource becomes judgment: deciding what is worth building, and why.

Over time, that shifts the whole system. Education no longer trains us to memorize but to think critically about goals, ethics, and design. Work stops revolving around pushing buttons on screens, because the screens themselves fade. AI assistants, tied into robotics and infrastructure, carry out most tasks automatically. The computer as we know it becomes invisible, woven into life itself.

We’ll still have challenges, decisions, and tradeoffs. But the nature of “work” may change from doing tasks to governing abundance.

So maybe the real endgame isn’t more apps, but fewer. Not more screens, but fewer. A society where the computer disappears, and the questions left are: what do we build, what do we ignore, and what do we protect?

What do you think? Follow Endrit Restelica for more tech stuff.
We won't need apps in the future to get things done; we'll have one agent that does it all.
My concern now is the security issues, and this calls for operational AI ethics and accountability across all sectors. AI is here to automate many processes, but we can’t underestimate the human-in-the-loop who consistently fact-checks these systems to ensure compliance with ethical standards and maintain a secure environment.
In the future we won't need any websites or apps; all we'll need is a chatbot to search and place an order. So why would we need those things at all? Just curious, Endrit Restelica.
While AI is a valuable tool, its capabilities are inherently constrained by the scope of its training data and the precision of the prompts it receives. A user's ability to craft a good prompt still determines the quality of the AI's output.
Endrit Restelica Spot on. Judging value becomes the skill, not building tools. Maybe the hardest part is resisting the urge to summon a new app for every minor inconvenience.
I think we are entering an unprecedented synthetic era.
Intriguing scenario: fewer apps, fewer screens, and software created ad hoc for current needs, likely expressed by talking. We would still have to trust the organization providing the AI infrastructure, and hope someone there can actually control the system. As a technical person, I'd want to be involved at a more detailed level to feel comfortable.
Yes! I wrote a LinkedIn article about this, and I like to call it the "Jarvis OS". Software is losing its value fast, and eventually it will just be AI connecting to content. ChatGPT already points this way, with people now building "apps" inside its platform.
This hits HARD. 🎯 Been watching this shift for months. The real question isn't "will AI kill coding" - it's "what happens when everyone can build anything?" Here's the brutal truth: We're about to see a MASSIVE flood of software. But 99% will be garbage, because building isn't the hard part anymore. The winners? Those who understand what's actually worth building. Strategy beats speed every time. What's your take - are we ready for this abundance?