Contributing
Set up a local development environment and contribute to Open Help Desk.
All contributions are welcome — bug reports, documentation improvements, and code changes. This page covers the technical setup. For design questions and feature discussions, open a GitHub issue first.
Before writing code
- Read DESIGN.md. It is the specification. Every feature and behavior described there is intentional. If something is ambiguous or you think the design is wrong, open an issue to discuss it — don't work around it.
- Check the issue tracker. Your bug may already be filed, or your feature may already be planned or deliberately excluded.
- Open an issue for non-trivial changes before submitting a PR. This prevents wasted effort if the direction isn't right.
- Tests are required. Write a failing test that defines "done" before implementing anything. No untested code will be merged.
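As a sketch of the test-first workflow, imagine adding a hypothetical ValidateTag rule to the domain layer (the names here are made up purely to show the shape): the assertions are written first and fail until the function exists.

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// ErrEmptyTag and ValidateTag are hypothetical. The point is the order of
// work: write the checks that define "done", watch them fail, then
// implement the rule until they pass.
var ErrEmptyTag = errors.New("tag must not be empty")

func ValidateTag(tag string) error {
	if strings.TrimSpace(tag) == "" {
		return ErrEmptyTag
	}
	return nil
}

func main() {
	fmt.Println(ValidateTag("billing")) // <nil>
	fmt.Println(ValidateTag("  "))      // tag must not be empty
}
```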
Open Help Desk is licensed under the GNU Affero General Public License v3.0. By contributing you agree that your contributions will be released under the same license.
Development environment
Requirements
- Go 1.24+
- Node.js 24+ and npm
- PostgreSQL 17+ (local or Docker)
- sqlc (for schema changes)
Clone and set up
git clone https://github.com/PubliciaLLC/open-help-desk
cd open-help-desk
Backend
cd backend
go mod download
Create a local PostgreSQL database for development:
createdb helpdesk_dev
createdb helpdesk_test # separate database for tests
Start the backend server:
DATABASE_URL="postgres://localhost:5432/helpdesk_dev?sslmode=disable" \
BASE_URL="http://localhost:8080" \
SESSION_SECRET="dev-session-secret-change-me" \
JWT_SECRET="dev-jwt-secret-change-me" \
APP_ENV=development \
go run ./cmd/server
The server listens on :8080. It serves the embedded placeholder frontend (a minimal HTML page) until you also run the frontend dev server.
Frontend
Run the Vite dev server alongside the backend. It proxies /api and /mcp to the Go server on :8080 and serves the React app with hot module replacement.
cd frontend
npm ci
npm run dev # starts at http://localhost:5173
Open http://localhost:5173 in your browser. API calls proxy to :8080 automatically.
Optional: local email
To test email notifications locally without an SMTP server, use Mailpit:
docker run -d -p 1025:1025 -p 8025:8025 axllent/mailpit
Then add to your backend environment:
SMTP_HOST=localhost
SMTP_PORT=1025
SMTP_FROM=dev@localhost
View captured emails at http://localhost:8025.
Running tests
Unit tests
Unit tests live alongside the code they test (foo_test.go next to foo.go) and test the domain layer. No database is required.
cd backend
go test ./internal/domain/...
Integration tests
Integration tests hit a real PostgreSQL database. They live in internal/database/database_test.go and internal/server/server_test.go.
Via Docker Compose (recommended — no local Postgres needed):
docker-compose -f docker/docker-compose.yml --profile test run --rm test
From the host (port 5432 is exposed by docker-compose, or use a local Postgres):
TEST_DATABASE_URL="postgres://helpdesk:helpdesk@localhost:5432/helpdesk?sslmode=disable" \
go test ./...
If TEST_DATABASE_URL is not set, integration tests are skipped (not failed). This allows running unit tests in CI environments without a database.
Test isolation
Integration tests use rolled-back transactions. Each test wraps its database operations in a transaction via testutil.TxQueries(t, db), which automatically rolls back when the test ends. Tests do not depend on state left by other tests and can be run in any order.
There is no truncation step and no test fixtures to manage. Each test starts from a clean state by virtue of the rollback.
Running a single test
TEST_DATABASE_URL="..." go test ./internal/server/... -run TestCreateTicket -v
Frontend tests
Frontend unit tests are not yet set up. End-to-end testing of the UI is done manually or via the backend integration tests, which exercise the full HTTP layer.
Full test run
cd backend
go build ./... # compilation check
go vet ./... # static analysis
go test ./internal/domain/... ./internal/config/... ./internal/middleware/... ./internal/server/notify/... # unit tests
TEST_DATABASE_URL="..." go test ./... -race -count=1 # all tests with race detector
Schema changes
The project uses golang-migrate for schema migrations and sqlc for type-safe Go query generation. Never hand-edit files in backend/internal/dbgen/ — they are regenerated on every sqlc generate run.
Write the migration files
Add a numbered pair of migration files to backend/internal/database/migrations/:
0025_add_ticket_tags.up.sql
0025_add_ticket_tags.down.sql
Number sequentially after the last existing migration. Always write a .down.sql that cleanly reverses the .up.sql. Migrations run automatically on startup — no manual step is required.
-- 0025_add_ticket_tags.up.sql
CREATE TABLE ticket_tags (
    ticket_id UUID NOT NULL REFERENCES tickets(id) ON DELETE CASCADE,
    tag TEXT NOT NULL,
    PRIMARY KEY (ticket_id, tag)
);

-- 0025_add_ticket_tags.down.sql
DROP TABLE ticket_tags;
Write the SQL queries
Add queries to the appropriate file in backend/queries/. Each query must have a sqlc annotation comment.
-- name: GetTicketTags :many
SELECT tag FROM ticket_tags
WHERE ticket_id = $1
ORDER BY tag;
-- name: AddTicketTag :exec
INSERT INTO ticket_tags (ticket_id, tag)
VALUES ($1, $2)
ON CONFLICT DO NOTHING;
-- name: RemoveTicketTag :exec
DELETE FROM ticket_tags
WHERE ticket_id = $1 AND tag = $2;
Regenerate Go types
cd backend && sqlc generate
This regenerates backend/internal/dbgen/. Review the diff — the generated code should match what you expect from your SQL. If the output looks wrong, fix the SQL before proceeding.
Add a store method
Add a method to the relevant store in backend/internal/database/ that calls the generated query functions. Add the method signature to the interface in the corresponding domain/ package if the domain layer needs access.
// In backend/internal/database/ticketstore/store.go
func (s *Store) GetTags(ctx context.Context, ticketID uuid.UUID) ([]string, error) {
    rows, err := s.q.GetTicketTags(ctx, ticketID)
    if err != nil {
        return nil, fmt.Errorf("getting ticket tags: %w", err)
    }
    return rows, nil
}
Write integration tests before using the new code
Add tests in backend/internal/database/database_test.go that exercise the new store methods against a real database using testutil.TxQueries. The tests must pass before the new code is used in domain services or HTTP handlers.
Code conventions
Project layout
backend/
  cmd/server/     -- main package: wires everything, starts the server
  internal/
    config/       -- env-var config loading
    database/     -- pgxpool connection, migrations, store implementations
    dbgen/        -- sqlc-generated code (never hand-edit)
    domain/       -- business logic (no HTTP, no DB imports)
    mcp/          -- MCP server layer
    middleware/   -- HTTP middleware (session auth, API key auth)
    server/       -- HTTP handlers and routing
    testutil/     -- shared test helpers
    ui/           -- embedded React SPA (go:embed)
  queries/        -- raw SQL for sqlc
The domain packages are the heart of the system. They must not import database, server, or any infrastructure package. All dependencies flow inward — domain code calls store interfaces, not store implementations.
Error handling
- Always return errors. Never silently swallow them.
- Wrap errors with context: fmt.Errorf("creating ticket: %w", err).
- Log only at the boundary (HTTP handler or main). Domain code returns errors; it does not log them.
- Map known domain errors to HTTP status codes in the handler or in server/respond.go. Unknown errors become 500.
No global state
No init() functions with side effects. No package-level variables that change at runtime. All dependencies — stores, services, configs — are passed explicitly through constructors and function parameters. If the wiring is verbose, that's fine: it's honest.
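A compressed sketch of what explicit wiring looks like (the types are hypothetical): nothing lives at package level, and every dependency arrives through a constructor.

```go
package main

import "fmt"

// Hypothetical types showing the wiring style: no init() side effects,
// no package-level mutable state.
type Config struct{ BaseURL string }

type Store struct{} // stands in for a real pgx-backed store

type TicketService struct {
	store *Store
	cfg   Config
}

func NewTicketService(store *Store, cfg Config) *TicketService {
	return &TicketService{store: store, cfg: cfg}
}

func main() {
	// All wiring happens here, in one place, exactly once.
	cfg := Config{BaseURL: "http://localhost:8080"}
	svc := NewTicketService(&Store{}, cfg)
	fmt.Println(svc.cfg.BaseURL)
}
```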
Testing conventions
- Use Go's stdlib testing package. No third-party assertion libraries.
- Table-driven tests for domain logic and HTTP handlers. One table, multiple cases.
- Integration tests use testutil.TxQueries: always rolled back, never rely on pre-existing data.
- Never mock the database. Real queries catch real bugs. See CLAUDE.md for the reasoning.
- Test the contract (status codes, JSON shape, domain behavior), not the implementation.
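A table-driven test has this shape. normalizeTag is a made-up domain function used only for illustration; in a real test file the loop lives inside a TestXxx function and reports failures with t.Errorf instead of printing.

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeTag is hypothetical: lowercases and trims a tag.
func normalizeTag(s string) string {
	return strings.ToLower(strings.TrimSpace(s))
}

func main() {
	// One table, multiple cases: each row names a behavior under test.
	tests := []struct {
		name, in, want string
	}{
		{"lowercases", "Billing", "billing"},
		{"trims whitespace", "  vpn ", "vpn"},
		{"empty stays empty", "", ""},
	}
	failures := 0
	for _, tc := range tests {
		if got := normalizeTag(tc.in); got != tc.want {
			failures++
			fmt.Printf("%s: got %q, want %q\n", tc.name, got, tc.want)
		}
	}
	fmt.Println("failures:", failures) // failures: 0
}
```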
Naming
- Short, precise variable names. A variable that lives for 3 lines doesn't need a paragraph.
- Exported types are for things that genuinely cross package boundaries. Default to unexported.
- Unexported functions are the default. Export only what's needed.
Scope discipline
Before adding anything, ask: Is it in DESIGN.md? If not, it's out of scope for the current version. The roadmap has explicit v1/v2/v3/v4 milestones. Do not implement v2+ features ahead of schedule, even if it "seems easy to add now." The cost is always higher than it looks — it lands in the test suite, the data model, and every future reader's mental model.
PR process
- Fork and branch: create a feature branch from main. Name it descriptively: feat/webhook-retries, fix/sla-timer-pause.
- Keep the diff small: one logical change per PR. Large PRs are hard to review and slow to merge. Split refactoring from feature work.
- Tests green — run the full test suite locally before opening the PR. The CI will also run it, but don't open a PR you know will fail.
- Write a clear description — explain what the change does and why. Link to the relevant issue. If there's a non-obvious design decision, explain it in the PR body, not just in a code comment.
- Respond to review comments promptly — if you disagree with a comment, explain why. If you agree, fix it. Don't let PRs stall.
What gets merged quickly
- Bug fixes with a regression test
- Documentation improvements
- Features explicitly called out in DESIGN.md for the current version
- Small, well-scoped refactors with clear motivation
What gets rejected
- Features not in DESIGN.md without prior discussion
- v2/v3/v4 features being added ahead of schedule
- Code without tests
- Breaking changes to existing API behavior without a migration path
- Speculative abstractions ("I added this interface because we might need it later")