Contributing

Set up a local development environment and contribute to Open Help Desk.

All contributions are welcome — bug reports, documentation improvements, and code changes. This page covers the technical setup. For design questions and feature discussions, open a GitHub issue first.

Before writing code

  1. Read DESIGN.md. It is the specification. Every feature and behavior described there is intentional. If something is ambiguous or you think the design is wrong, open an issue to discuss it — don't work around it.
  2. Check the issue tracker. Your bug may already be filed, or your feature may already be planned or deliberately excluded.
  3. Open an issue for non-trivial changes before submitting a PR. This prevents wasted effort if the direction isn't right.
  4. Tests are required. Write a failing test that defines "done" before implementing anything. No untested code will be merged.

Open Help Desk is licensed under the GNU Affero General Public License v3.0. By contributing you agree that your contributions will be released under the same license.

Development environment

Requirements

Clone and set up

git clone https://github.com/PubliciaLLC/open-help-desk
cd open-help-desk

Backend

cd backend
go mod download

Create a local PostgreSQL database for development:

createdb helpdesk_dev
createdb helpdesk_test  # separate database for tests

Start the backend server:

DATABASE_URL="postgres://localhost:5432/helpdesk_dev?sslmode=disable" \
BASE_URL="http://localhost:8080" \
SESSION_SECRET="dev-session-secret-change-me" \
JWT_SECRET="dev-jwt-secret-change-me" \
APP_ENV=development \
go run ./cmd/server

The server listens on :8080. It serves the embedded placeholder frontend (a minimal HTML page) until you also run the frontend dev server.

Frontend

Run the Vite dev server alongside the backend. It proxies /api and /mcp to the Go server on :8080 and serves the React app with hot module replacement.

cd frontend
npm ci
npm run dev  # starts at http://localhost:5173

Open http://localhost:5173 in your browser. API calls proxy to :8080 automatically.

Optional: local email

To test email notifications locally without an SMTP server, use Mailpit:

docker run -d -p 1025:1025 -p 8025:8025 axllent/mailpit

Then add to your backend environment:

SMTP_HOST=localhost
SMTP_PORT=1025
SMTP_FROM=dev@localhost

View captured emails at http://localhost:8025.
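
As a sketch of what a notification sender does with these settings (function names here are illustrative, not the project's real notify API), the Go standard library alone can assemble a message and hand it to Mailpit:

```go
package main

import (
	"fmt"
	"net/smtp"
)

// buildMessage assembles a minimal RFC 5322 message. Illustrative helper,
// not the project's actual notifier code.
func buildMessage(from, to, subject, body string) []byte {
	return []byte(fmt.Sprintf("From: %s\r\nTo: %s\r\nSubject: %s\r\n\r\n%s\r\n",
		from, to, subject, body))
}

// sendDevMail delivers through the local Mailpit relay on port 1025.
// Mailpit accepts unauthenticated SMTP, so the Auth argument is nil.
func sendDevMail(to, subject, body string) error {
	from := "dev@localhost"
	return smtp.SendMail("localhost:1025", nil, from, []string{to},
		buildMessage(from, to, subject, body))
}

func main() {
	// Print the assembled message; calling sendDevMail instead would make
	// it appear in the Mailpit UI at http://localhost:8025.
	fmt.Printf("%s", buildMessage("dev@localhost", "agent@example.com",
		"Test", "Hello from Mailpit"))
}
```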

Running tests

Unit tests

Unit tests live alongside the code they test (foo_test.go next to foo.go) and test the domain layer. No database is required.

cd backend
go test ./internal/domain/...

Integration tests

Integration tests hit a real PostgreSQL database. They live in internal/database/database_test.go and internal/server/server_test.go.

Via Docker Compose (recommended — no local Postgres needed):

docker-compose -f docker/docker-compose.yml --profile test run --rm test

From the host (port 5432 is exposed by docker-compose, or use a local Postgres):

TEST_DATABASE_URL="postgres://helpdesk:helpdesk@localhost:5432/helpdesk?sslmode=disable" \
  go test ./...

If TEST_DATABASE_URL is not set, integration tests are skipped (not failed). This allows running unit tests in CI environments without a database.

Test isolation

Integration tests use rolled-back transactions. Each test wraps its database operations in a transaction via testutil.TxQueries(t, db), which automatically rolls back when the test ends. Tests do not depend on state left by other tests and can be run in any order.

There is no truncation step and no test fixtures to manage. Each test starts from a clean state by virtue of the rollback.
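
The shape of that pattern, reduced to a self-contained toy (testutil.TxQueries's real signature and the underlying pgx types are not shown here; the deferred rollback plays the role of t.Cleanup):

```go
package main

import "fmt"

// fakeTx stands in for a database transaction: writes are staged until
// Rollback discards them.
type fakeTx struct {
	writes     []string
	rolledBack bool
}

func (tx *fakeTx) Exec(stmt string) { tx.writes = append(tx.writes, stmt) }
func (tx *fakeTx) Rollback()        { tx.writes, tx.rolledBack = nil, true }

// runIsolated begins a transaction, runs the test body, and always rolls
// back, so no state leaks to the next test regardless of ordering.
func runIsolated(body func(tx *fakeTx)) *fakeTx {
	tx := &fakeTx{}
	defer tx.Rollback() // the real helper registers this via t.Cleanup
	body(tx)
	return tx
}

func main() {
	tx := runIsolated(func(tx *fakeTx) {
		tx.Exec("INSERT INTO tickets ...")
		fmt.Println("writes visible inside the test:", len(tx.writes))
	})
	fmt.Println("writes visible afterwards:", len(tx.writes),
		"rolled back:", tx.rolledBack)
}
```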

Running a single test

TEST_DATABASE_URL="..." go test ./internal/server/... -run TestCreateTicket -v

Frontend tests

Frontend unit tests are not yet set up. End-to-end testing of the UI is done manually or via the backend integration tests, which exercise the full HTTP layer.

Full test run

cd backend
go build ./...                                                         # compilation check
go vet ./...                                                           # static analysis
go test ./internal/domain/... ./internal/config/... ./internal/middleware/... ./internal/server/notify/...  # unit tests
TEST_DATABASE_URL="..." go test ./... -race -count=1                   # all tests with race detector

Schema changes

The project uses golang-migrate for schema migrations and sqlc for type-safe Go query generation. Never hand-edit files in backend/internal/dbgen/ — they are regenerated on every sqlc generate run.

Step 1: Write the migration files

Add a numbered pair of migration files to backend/internal/database/migrations/:

0025_add_ticket_tags.up.sql
0025_add_ticket_tags.down.sql

Number sequentially after the last existing migration. Always write a .down.sql that cleanly reverses the .up.sql. Migrations run automatically on startup — no manual step is required.

-- 0025_add_ticket_tags.up.sql
CREATE TABLE ticket_tags (
    ticket_id UUID NOT NULL REFERENCES tickets(id) ON DELETE CASCADE,
    tag       TEXT NOT NULL,
    PRIMARY KEY (ticket_id, tag)
);

-- 0025_add_ticket_tags.down.sql
DROP TABLE ticket_tags;
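
Since every up migration needs a matching down, a quick pairing check like this can catch a forgotten file (illustrative helper for this guide, not part of the repository):

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// checkMigrationPairs reports every migration file whose partner is missing.
func checkMigrationPairs(files []string) []string {
	ups := map[string]bool{}
	downs := map[string]bool{}
	for _, f := range files {
		switch {
		case strings.HasSuffix(f, ".up.sql"):
			ups[strings.TrimSuffix(f, ".up.sql")] = true
		case strings.HasSuffix(f, ".down.sql"):
			downs[strings.TrimSuffix(f, ".down.sql")] = true
		}
	}
	var missing []string
	for base := range ups {
		if !downs[base] {
			missing = append(missing, base+".down.sql")
		}
	}
	for base := range downs {
		if !ups[base] {
			missing = append(missing, base+".up.sql")
		}
	}
	sort.Strings(missing)
	return missing
}

func main() {
	fmt.Println(checkMigrationPairs([]string{
		"0025_add_ticket_tags.up.sql",
		"0025_add_ticket_tags.down.sql",
		"0026_oops.up.sql", // no matching down file: reported as missing
	}))
}
```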

Step 2: Write the SQL queries

Add queries to the appropriate file in backend/queries/. Each query must have a sqlc annotation comment.

-- name: GetTicketTags :many
SELECT tag FROM ticket_tags
WHERE ticket_id = $1
ORDER BY tag;

-- name: AddTicketTag :exec
INSERT INTO ticket_tags (ticket_id, tag)
VALUES ($1, $2)
ON CONFLICT DO NOTHING;

-- name: RemoveTicketTag :exec
DELETE FROM ticket_tags
WHERE ticket_id = $1 AND tag = $2;

Step 3: Regenerate Go types

cd backend && sqlc generate

This regenerates backend/internal/dbgen/. Review the diff — the generated code should match what you expect from your SQL. If the output looks wrong, fix the SQL before proceeding.

Step 4: Add a store method

Add a method to the relevant store in backend/internal/database/ that calls the generated query functions. Add the method signature to the interface in the corresponding domain/ package if the domain layer needs access.

// In backend/internal/database/ticketstore/store.go

func (s *Store) GetTags(ctx context.Context, ticketID uuid.UUID) ([]string, error) {
    rows, err := s.q.GetTicketTags(ctx, ticketID)
    if err != nil {
        return nil, fmt.Errorf("getting ticket tags: %w", err)
    }
    return rows, nil
}

Step 5: Write integration tests before using the new code

Add tests in backend/internal/database/database_test.go that exercise the new store methods against a real database using testutil.TxQueries. The tests must pass before the new code is used in domain services or HTTP handlers.

Code conventions

Project layout

backend/
  cmd/server/         -- main package: wires everything, starts the server
  internal/
    config/           -- env-var config loading
    database/         -- pgxpool connection, migrations, store implementations
    dbgen/            -- sqlc-generated code (never hand-edit)
    domain/           -- business logic (no HTTP, no DB imports)
    mcp/              -- MCP server layer
    middleware/       -- HTTP middleware (session auth, API key auth)
    server/           -- HTTP handlers and routing
    testutil/         -- shared test helpers
    ui/               -- embedded React SPA (go:embed)
  queries/            -- raw SQL for sqlc

The domain packages are the heart of the system. They must not import database, server, or any infrastructure package. All dependencies flow inward — domain code calls store interfaces, not store implementations.

Error handling

No global state

No init() functions with side effects. No package-level variables that change at runtime. All dependencies — stores, services, configs — are passed explicitly through constructors and function parameters. If the wiring is verbose, that's fine: it's honest.
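
A minimal sketch of that wiring style (the types are illustrative, not the project's real ones): every dependency arrives through a constructor, so the whole graph is visible in one place in cmd/server:

```go
package main

import "fmt"

type Config struct{ DatabaseURL string }

type Store struct{ url string }

func NewStore(cfg Config) *Store { return &Store{url: cfg.DatabaseURL} }

type Service struct{ store *Store }

func NewService(store *Store) *Service { return &Service{store: store} }

type Server struct{ svc *Service }

func NewServer(svc *Service) *Server { return &Server{svc: svc} }

func main() {
	// No init() side effects, no mutable package-level state: the graph is
	// built top-down and each piece is handed to the next explicitly.
	cfg := Config{DatabaseURL: "postgres://localhost/helpdesk_dev"}
	server := NewServer(NewService(NewStore(cfg)))
	fmt.Println("wired:", server.svc.store.url)
}
```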

Testing conventions

Naming

Scope discipline

Before adding anything, ask: Is it in DESIGN.md? If not, it's out of scope for the current version. The roadmap has explicit v1/v2/v3/v4 milestones. Do not implement v2+ features ahead of schedule, even if it "seems easy to add now." The cost is always higher than it looks — it lands in the test suite, the data model, and every future reader's mental model.

PR process

  1. Fork and branch — create a feature branch from main. Name it descriptively: feat/webhook-retries, fix/sla-timer-pause.
  2. Keep the diff small — one logical change per PR. Large PRs are hard to review and slow to merge. Split refactoring from feature work.
  3. Tests green — run the full test suite locally before opening the PR. The CI will also run it, but don't open a PR you know will fail.
  4. Write a clear description — explain what the change does and why. Link to the relevant issue. If there's a non-obvious design decision, explain it in the PR body, not just in a code comment.
  5. Respond to review comments promptly — if you disagree with a comment, explain why. If you agree, fix it. Don't let PRs stall.

What gets merged quickly

What gets rejected
