We all knew it

  • Ephera@lemmy.ml · 6 months ago

    Traditionally (as in 20+ years ago), software was developed according to the Waterfall model, the V-model or similar. That required documenting all the requirements before a project could be started. (Certain software development fields do still work that way due to legal requirements.)

    This often failed, because the documented requirements did not actually match what was needed, not to mention that the real requirements often shifted throughout development.

    Agile, on the other hand, starts development and requirements gathering pretty much in parallel. The customers get a more tangible picture of what the software actually looks like. The software developers also take over the role of requirements engineers and domain experts, which helps them make more fitting software architecture decisions. And you can more easily cancel a project if it turns out not to be needed anymore (which is also why a raw cancellation percentage is a misleading measure of failure).

    The trouble with Agile, in turn, is that projects can get started with really no idea what the goal even is, and often without enough budget reserved to actually complete them (obviously, running out of budget can itself be a failure state, although if the project is promising enough, customers will find the money for it somewhere).
    Also, as a software dev, you do spend a lot of time working out those requirements.

    But yeah, basically pick your poison. Even if people like to complain about Agile methodology, it’s what most of the software development world considers the more successful approach.