A lot of applications are essentially data structures with a user interface.
That is not to say that a lot of other work does not go into them, of course. But given a sufficiently standardized (or perhaps flexible) framework, quite a few real-world software platforms could be reduced to essentially a complex data structure. For instance, Mastodon- or BlueSky-style social media platforms can, at a very high level, be treated as lists of posts, with logic for filtering those lists and merging multiple lists into one, as well as pushing and popping items, paging, and so on.
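To make the point concrete, here is a minimal sketch of that "feed as a data structure" view. All the names and fields here (`Post`, `merge_feeds`, and so on) are invented for illustration and do not correspond to any real platform's API; the point is only how far plain list operations get you.

```python
from dataclasses import dataclass
from heapq import merge

# Hypothetical minimal model of a post; real platforms carry far more
# metadata, but the shape of the operations stays the same.
@dataclass(frozen=True)
class Post:
    author: str
    timestamp: int  # seconds since some epoch; newer = larger
    text: str

def merge_feeds(*feeds: list[Post]) -> list[Post]:
    """Merge several reverse-chronological feeds into one timeline."""
    # Each input feed is assumed to already be sorted newest-first,
    # so a k-way merge preserves that order across all of them.
    return list(merge(*feeds, key=lambda p: p.timestamp, reverse=True))

def filter_feed(feed: list[Post], keep) -> list[Post]:
    """Keep only posts matching a predicate (e.g. a mute or topic filter)."""
    return [p for p in feed if keep(p)]

def page(feed: list[Post], page_size: int, page_number: int) -> list[Post]:
    """Simple offset-based paging over a feed."""
    start = page_number * page_size
    return feed[start:start + page_size]
```

A "home timeline" is then just `merge_feeds(followed, local)` piped through `filter_feed` and `page`; everything platform-specific lives in the predicates and the storage behind the lists, not in the structure itself.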
There is nothing wrong with this, of course. Data management and the development of data structures that properly represent real-world items and activities are hard problems, and developing and refining the proper data structure for a particular task is a real contribution. Often, that is where most of the truly hard conceptual work in software development goes, at least during the design and development phases.
However, modern software development ignores how little of a typical application actually requires novel code, or how little of it would require novel code if we had better frameworks and other off-the-shelf tools to handle boilerplate. User interfaces work best when they follow the principle of least surprise, which is a strong argument for standardizing along the lines of existing conventions — a process which has already progressed quite a ways informally, and would often be more a question of documenting established conventions than of prescribing anything. This standardization would in turn improve the offerings of drop-in UI frameworks, to the point where simple or MVP applications might require little or no UI work at all, much as frameworks like Django already abstract away much of the complexity of database management in non-extreme use cases.
If you squint hard enough, you can sort of see this as one of the factors driving otherwise competent developers towards so-called "AI" "tools": developers realize that real-world software development requires writing far more code than the problem should actually require, but real software libraries written by real humans that would make that boilerplate disappear either do not exist, lack good enough documentation, are unusable for reason X, or would require adding yet another dependency to the project, and we know how that usually goes. This is a problem we can solve, but it will need to be done carefully and deliberately, with thought put into every step. There is no room for vibe coding here.
Another implication of this is the importance of modularity, including the modularity of user interfaces. There is no reason why a user should not be able to pick and choose the UI through which they use any particular tool, especially when the UI contains very little that is actually specific to the business logic. There is no reason one should not be able to choose one UI theme for their phone or computer, and have it applied to every app and website they use.
This is something everyone working on or using software should keep in mind in an age where every bank, grocery store, and rail carrier has their own app, which is presented to the user as a monolithic, company-controlled storefront that one has to take or leave — and increasingly, leaving means not participating in an entire sector of society, which is a good option for nobody and not an option at all for most people. There is a reason I never asked why software is less modular than it could be: the straight answer would be something like "well, capitalism...", and anyone with even a little knowledge of the real world already knows it.
Unfortunately, this also means that the solution must be social and/or political, not simply technical: people must again treat computers as tools that serve their users, not as extensions of corporate or government authorities that have to be adapted to.
The core of what people do on computers is manipulate data, and that is what software should be optimized for. Sometimes the data being manipulated needs a very specific appearance for artistic or infographic purposes, but most of the time it can fit within well-established presentational conventions. The principle of least surprise is not just a best practice, but an essential tenet of usability.
We need to work to build a future where those in the software world understand and respect this fact and their users, and where users have the freedom and ability to interact with computers in a manner best suited to them as individuals.