Assessment criteria and examples

Project documentation

Information architecture

The overall structure (pages/subpages/sections/subsections) of your project documentation. We evaluate on the following:

  • Is there high-level conceptual/“About” content?
  • Is the documentation feature complete? (i.e., is each product feature documented?)
  • Are there step-by-step instructions (tasks, tutorials) documented for features?
  • Are there any key features which are documented but missing task documentation?
  • Is the “happy path”/most common use case documented?
  • Does task and tutorial content demonstrate atomicity and isolation of concerns? (Are tasks clearly named according to user goals?)
  • If the documentation does not suffice, is there a clear escalation path for users needing more help? (FAQ, troubleshooting guide)
  • If the product exposes an API, is there a complete reference? (A sketch of what a machine-readable reference source looks like follows this list.)
  • Is content up to date and accurate?
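
For the API reference item above, assessors typically look for a complete reference generated from a machine-readable source rather than hand-maintained endpoint lists. A minimal sketch of such a source (OpenAPI; the project and endpoint names are hypothetical):

```yaml
# Hypothetical OpenAPI excerpt: each operation carries a summary
# and documented responses, so a complete per-endpoint reference
# can be generated from it.
openapi: "3.0.3"
info:
  title: Example Project API    # hypothetical project name
  version: "1.0.0"
paths:
  /widgets:                     # hypothetical endpoint
    get:
      summary: List all widgets
      responses:
        "200":
          description: A JSON array of widget objects.
```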

Examples:

New user content

New users are the most avid consumers of documentation, and they need content written specifically for them. We evaluate on the following:

  • Is “getting started” clearly labeled? (“Getting started”, “Installation”, “First steps”, etc.)
  • Is installation documented step-by-step?
  • If needed, are multiple OSes documented?
  • Do users know where to go after reading the getting started guide?
  • Is your new user content clearly signposted on your site’s homepage or at the top of your information architecture?
  • Is there easily copy-pastable sample code or other example content?

Examples:

Content maintainability & site mechanics

As a project scales, concerns like localized (translated) content and versioning become significant maintenance burdens, particularly if you don’t plan for them.

We evaluate on the following:

  • Is your documentation searchable?
  • Are you planning for localization/internationalization with regard to site directory structure? Is a localization framework present?
  • Do you have a clearly documented method for versioning your content? (A configuration sketch covering all three concerns follows this list.)
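
To make these concerns concrete, here is a minimal sketch of a Hugo site configuration covering all three bullets, assuming the Docsy theme (the locales, versions, and URLs are hypothetical; the parameter names follow Docsy’s documented conventions):

```yaml
# hugo.yaml -- illustrative excerpt, assuming Hugo + Docsy
defaultContentLanguage: en
languages:
  en:
    contentDir: content/en      # each locale gets its own directory tree
    languageName: English
    weight: 1
  es:
    contentDir: content/es
    languageName: Español
    weight: 2
params:
  offlineSearch: true           # Docsy's built-in (Lunr-based) site search
  version_menu: Releases        # Docsy's version selector menu
  versions:                     # hypothetical versions and URLs
    - version: v1.1
      url: https://example.com/docs/v1.1
    - version: v1.0
      url: https://example.com/docs/v1.0
```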

Examples:

Content creation processes

Documentation is only as useful as it is accurate and well-maintained, and requires the same kind of review and approval processes as code.

We evaluate on the following:

  • Is there a clearly documented (ongoing) contribution process for documentation?
  • Does your code release process account for documentation creation & updates?
  • Who reviews and approves documentation pull requests? (See the sketch after this list.)
  • Does the website have a clear owner/maintainer?
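
One common pattern for the review questions above: projects that use Kubernetes-style Prow automation record documentation reviewers and approvers in an OWNERS file alongside the content. A sketch with hypothetical usernames:

```yaml
# docs/OWNERS -- Prow-style review assignments (usernames are hypothetical)
reviewers:
  - docs-contributor-a
  - docs-contributor-b
approvers:
  - docs-maintainer
labels:
  - area/documentation
```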

Examples:

Inclusive language

Creating inclusive project communities is a key goal for all CNCF projects.

We evaluate on the following:

  • Are there any customer-facing utilities, endpoints, class names, or feature names that use non-recommended words as documented by the Inclusive Naming Initiative website? (A CI sketch follows this list.)
  • Does the project use language like “simple” and “easy”? (Such words can discourage readers for whom the task turns out not to be easy.)
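
One way to keep this checked continuously is a CI step that scans for non-recommended terms. A minimal sketch as a GitHub Actions workflow using a plain grep denylist (the term list and paths are illustrative; purpose-built linters for the Inclusive Naming Initiative word lists also exist):

```yaml
# .github/workflows/inclusive-language.yaml -- illustrative sketch
name: inclusive-language
on: [pull_request]
jobs:
  check-terms:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Flag non-recommended terms in docs
        # Fails if any listed term appears; this denylist is a small
        # hypothetical sample, not the full recommended word list.
        run: |
          ! grep -rniE 'whitelist|blacklist|sanity check' docs/
```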

Contributor documentation

Communication methods documented

One of the easiest ways to attract new contributors is making sure they know how to reach you.

We evaluate on the following:

  • Is there a Slack/Discord/Discourse/etc. community and is it prominently linked from your website?
  • Is there a direct link to your GitHub organization/repository?
  • Are weekly/monthly project meetings documented? Is it clear how someone can join those meetings?
  • Are mailing lists documented?

Examples:

Beginner-friendly issue backlog

We evaluate on the following:

  • Are docs issues well-triaged?
  • Is there a clearly marked way for new contributors to make code or documentation contributions (e.g., a “good first issue” label)?
  • Are issues well-documented (i.e., more than just a title)?
  • Are stale issues regularly triaged and closed? (See the workflow sketch after this list.)
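
Staleness maintenance is commonly automated. A sketch using the actions/stale GitHub Action (the thresholds, labels, and message are illustrative choices, not recommendations):

```yaml
# .github/workflows/stale.yaml -- illustrative sketch
name: stale
on:
  schedule:
    - cron: '0 4 * * *'         # once a day
permissions:
  issues: write
  pull-requests: write
jobs:
  stale:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/stale@v9
        with:
          days-before-stale: 90
          days-before-close: 14
          stale-issue-message: >-
            This issue has been inactive for 90 days and will be closed
            in 14 days unless there is new activity.
          stale-issue-label: lifecycle/stale
          exempt-issue-labels: good first issue
```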

Examples:

New contributor getting started content

Open source is complex, and projects develop many processes to manage it. Are those processes written down and easy to understand, so that new contributors can jump in easily?

We evaluate on the following:

  • Do you have a community repository or section on your website?
  • Is there a document specifically for new contributors/your first contribution?
  • Do new contributors know where to get help?

Examples:

Project governance documentation

One of the CNCF’s core project values is open governance.

We evaluate on the following:

  • Is project governance clearly documented?

Examples:

  • Any graduated CNCF project

Website

Branding

CNCF seeks to support enterprise-ready open source software. A key aspect of this is branding and marketing.

We evaluate on the following:

  • Is there an easily recognizable brand for the project (logo + color scheme)?
  • Is the brand used consistently across the website?
  • Is the website’s typography clean and well suited for reading?

Examples:

Case studies/social proof

One of the best ways to advertise an open source project is to show other organizations using it.

We evaluate on the following:

  • Are there case studies available for the project and are they documented on the website?
  • Are there user testimonials available?
  • Is there an active project blog?
  • Are there community talks for the project and are they present on the website?
  • Is there a logo wall of users/participating organizations?

Examples:

Maintenance planning

Website maintenance is an important part of project success, especially when project maintainers aren’t web developers.

We evaluate on the following:

  • Is your website tooling well supported by the community (e.g., Hugo with the Docsy theme) or commonly used by CNCF projects (our recommended tech stack)? (See the sketch after this list.)
  • Are you actively cultivating website maintainers from within the community?
  • Are site build times reasonable?
  • Do site maintainers have adequate permissions?
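
For reference, adopting the recommended stack is a small amount of configuration. A sketch of importing Docsy as a Hugo module (the version constraint is illustrative):

```yaml
# hugo.yaml -- illustrative excerpt: Docsy imported as a Hugo module
module:
  hugoVersion:
    extended: true              # Docsy requires the extended Hugo build
    min: "0.110.0"              # illustrative minimum version
  imports:
    - path: github.com/google/docsy
```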

Examples: