
ADR-009 - Deployment Environments

Status: accepted
Date: 2025-04-15
Decision Makers: Dardan Bujupaj (AX), Marc Gähwiler (AX), Markus Brenner (Griesser)
Referenced By: ADR-010 - Deployment Process, ADR-017 - Testing Strategy

Context

To effectively develop, test, and release software updates for the Sunny Calculator AddOn, we need a structured progression through different environments. Each environment serves a specific purpose, from initial development and testing to final user validation and production release. Furthermore, consistency with existing practices within Griesser simplifies tooling, processes, and understanding across teams.

Decision

We will adopt the following four standard deployment environments for the Sunny Calculator AddOn:

  1. Development (DEV): Used by developers for initial coding, feature development, and basic testing during the implementation phase (for example, to develop against deployed services when not all of them can be run locally). Runs in a shared Azure cloud environment that may be less stable and less performant than the later stages.
  2. Integration Test (INT): A shared environment where completed features are deployed and integrated. This is the primary environment for running automated integration and end-to-end tests against deployed code.
  3. User Acceptance Test (UAT): A stable environment mimicking production as closely as possible. Used by business stakeholders and end-users (from Griesser) to perform manual validation and ensure the software meets business requirements before release.
  4. Production (PROD): The live environment used by end-users (the sales team) for their daily work of generating customer offerings.

This set of environments mirrors the standard adopted across other systems within Griesser.
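To make the distinctions concrete, the sketch below shows one way the four environments and their promotion order could be encoded as a typed configuration for deployment tooling. This is a minimal illustration, not an existing configuration: all field names and values are assumptions derived from the descriptions above.

```typescript
// Hypothetical sketch: the four standard environments as a typed configuration.
// All concrete values are illustrative, not the actual Sunny Calculator setup.
type EnvironmentName = "dev" | "int" | "uat" | "prod";

interface EnvironmentConfig {
  purpose: string;               // the quality gate this stage represents
  sharedInfrastructure: boolean; // DEV/INT run on shared, possibly less stable resources
  automatedTestTarget: boolean;  // INT is the primary target for integration/E2E tests
  manualValidation: boolean;     // UAT is validated manually by Griesser stakeholders
}

const environments: Record<EnvironmentName, EnvironmentConfig> = {
  dev: {
    purpose: "Initial coding, feature development, basic testing",
    sharedInfrastructure: true,
    automatedTestTarget: false,
    manualValidation: false,
  },
  int: {
    purpose: "Automated integration and end-to-end tests against deployed code",
    sharedInfrastructure: true,
    automatedTestTarget: true,
    manualValidation: false,
  },
  uat: {
    purpose: "Manual business validation before release",
    sharedInfrastructure: false,
    automatedTestTarget: false,
    manualValidation: true,
  },
  prod: {
    purpose: "Live use by the sales team",
    sharedInfrastructure: false,
    automatedTestTarget: false,
    manualValidation: false,
  },
};

// The promotion path DEV -> INT -> UAT -> PROD, encoded so deployment
// scripts can enforce the progression rather than rely on convention.
const promotionOrder: EnvironmentName[] = ["dev", "int", "uat", "prod"];

function nextStage(current: EnvironmentName): EnvironmentName | undefined {
  // Returns the next environment in the promotion path, or undefined for PROD.
  return promotionOrder[promotionOrder.indexOf(current) + 1];
}
```

Encoding the promotion order in the tooling, rather than only in documentation, is one way to realize the benefit noted below of enforcing progression through validation stages.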

Consequences

  • Benefits:
    • Provides distinct, well-defined stages for quality assurance (automated integration tests, manual user validation).
    • Aligns with Griesser’s company standards, promoting consistency and potentially leveraging shared tooling or knowledge.
    • Reduces the risk of deploying faulty code to production by enforcing progression through validation stages.
  • Drawbacks:
    • Requires managing and maintaining four separate environments (infrastructure configuration, data synchronization/anonymization challenges, deployment targets).
    • Introduces some latency in the overall release process due to the multiple stages.
  • Requires: Clear definition of the purpose, stability expectations, data management strategy, and access controls for each environment.

Alternatives

  • Fewer Environments (e.g., Dev, Staging, Prod):
    • Pros: Simpler infrastructure setup, potentially faster promotion path.
    • Cons: Combines automated integration testing and manual user acceptance into a single “Staging” environment, potentially leading to instability for UAT testers if automated tests fail frequently. Creates less distinct quality gates.
  • More Environments (e.g., adding dedicated Performance Test Env, setting up an environment for each branch):
    • Pros: Allows for isolated performance testing/feature development without impacting DEV, INT or UAT.
    • Cons: Increases infrastructure costs and maintenance overhead. Performance testing can potentially be conducted within INT or UAT for the initial phases if dedicated resources aren’t warranted yet.
  • Different Naming Conventions:
    • Pros: Could potentially use names perceived as clearer if the standard names were ambiguous.
    • Cons: Creates inconsistency with the rest of the Griesser ecosystem, potentially causing confusion, hindering shared tooling adoption, and complicating cross-system processes.