35 changes: 35 additions & 0 deletions .github/workflows/docs-metadata-guard.yml
@@ -0,0 +1,35 @@
name: Docs Metadata Guard

on:
pull_request:
paths:
- "data/docs/**"
- "next.config.js"
- "scripts/check-docs-metadata.js"
- "tests/docs-metadata.test.js"
- "tests/fixtures/**"
- "package.json"
- ".github/workflows/docs-metadata-guard.yml"

jobs:
check-metadata:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v5
with:
fetch-depth: 0

- name: Set up Node.js
uses: actions/setup-node@v6
with:
node-version: "20"

- name: Install dependencies
run: yarn install --frozen-lockfile --non-interactive

- name: Run docs metadata tests
run: yarn test:docs-metadata

- name: Validate docs metadata
run: yarn check:docs-metadata
10 changes: 9 additions & 1 deletion .husky/pre-commit
@@ -1,15 +1,23 @@
#!/usr/bin/env sh

set -e

echo 'husky (pre-commit): running lint-staged'
yarn lint-staged

STAGED_FILES=$(git diff --cached --name-only)

# Check for docs redirect changes
if printf '%s\n' "$STAGED_FILES" | grep -E '^(data/docs/.*\.mdx|next\.config\.js|scripts/check-doc-redirects\.js)$' >/dev/null; then
echo 'husky (pre-commit): verifying docs redirects'
yarn check:doc-redirects
else
echo 'husky (pre-commit): skipping docs redirect check (no relevant changes staged)'
fi

# Check for docs metadata
if printf '%s\n' "$STAGED_FILES" | grep -E '^data/docs/.*\.mdx$' >/dev/null; then
echo 'husky (pre-commit): validating docs metadata'
HUSKY_PRE_COMMIT=true yarn check:docs-metadata
else
echo 'husky (pre-commit): skipping docs metadata check (no documentation changes staged)'
fi
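The gating logic in the hook above can be exercised in isolation. This sketch uses a made-up staged file list to show which paths trigger the metadata check; only `.mdx` files under `data/docs/` match:

```shell
# Hypothetical staged file list; only the .mdx file under data/docs/ matches.
STAGED_FILES='data/docs/setup.mdx
next.config.js
README.md'

if printf '%s\n' "$STAGED_FILES" | grep -E '^data/docs/.*\.mdx$' >/dev/null; then
  echo 'metadata check runs'
else
  echo 'metadata check skipped'
fi
```

Note that the pattern is anchored with `^` and `$`, so a file such as `tests/data/docs/sample.mdx` would not match.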
3 changes: 3 additions & 0 deletions CONTRIBUTING.md
@@ -39,9 +39,11 @@ Thanks for helping improve SigNoz documentation. Clear, complete docs are critic
- Pre-commit behavior
- Runs `lint-staged` on staged files. ESLint and Prettier fix typical JS/TS/MD/MDX formatting and lint issues.
- When changes include docs or redirect-related files (`data/docs/**/*.mdx`, `next.config.js`, or `scripts/check-doc-redirects.js`), it runs `yarn check:doc-redirects` to ensure renamed/moved docs have permanent redirects.
- When changes include docs (`data/docs/**/*.mdx`), it runs `yarn check:docs-metadata` to ensure metadata such as `date`, `tags`, and `title` is complete and correct.
- Fixing failures
- Lint/format: run `yarn lint` or re-stage after auto-fixes from Prettier/ESLint.
- Redirects: run `yarn check:doc-redirects` locally to see missing entries, then add a permanent redirect in `next.config.js` under `async redirects()`. Re-stage and commit.
- Metadata: run `yarn check:docs-metadata` locally to see missing/invalid entries, then update the metadata in the `.mdx` file. Re-stage and commit.
- Optional: `yarn test:doc-redirects` runs a small test for redirect rules.
- Hooks path
- The repo uses Husky v9 defaults (`core.hooksPath=.husky`). If your local Git still points elsewhere (e.g., `.husky/_` from older setups), run `git config core.hooksPath .husky` or re-run `yarn install` to refresh hooks.
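To make the metadata requirements concrete, a frontmatter block like the following (field values are hypothetical) satisfies the required `title` and `date` fields and the recommended `tags` array:

```yaml
---
title: Monitor My Service # required, non-empty
date: 2024-06-01 # required, YYYY-MM-DD, at most 7 days in the future
tags: [SigNoz Cloud, Self-Host] # recommended, non-empty array
---
```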
@@ -163,6 +165,7 @@ Every doc should be skimmable and actionable.
```

- Use descriptive anchor text that makes the link destination clear. Avoid generic phrases like "here" or "link" and do not paste raw URLs into the body text.

- ✅ `Learn from the [Temporal Golang sample repository](https://github.com/SigNoz/temporal-golang-opentelemetry/tree/main)`
- ❌ `See (link)` or `Refer to https://github.com/...`

2 changes: 1 addition & 1 deletion data/docs/mastra-observability.mdx
@@ -143,4 +143,4 @@ When you click on a trace in SigNoz, you'll see a detailed view of the trace, in

You can also check out our [custom Mastra dashboard](https://signoz.io/docs/dashboards/dashboard-templates/mastra-dashboard/), which provides specialized visualizations for monitoring your Mastra usage in applications. The dashboard includes pre-built charts specifically tailored for LLM usage, along with import instructions to get started quickly.

<Figure src="/img/docs/llm/mastra/mastra-dashboard.webp" alt="Mastra Dashboard" caption="Mastra Dashboard Template" />
<Figure src="/img/docs/llm/mastra/mastra-dashboard.webp" alt="Mastra Dashboard" caption="Mastra Dashboard Template" />
2 changes: 2 additions & 0 deletions package.json
@@ -11,6 +11,8 @@
"lint": "next lint --fix --dir pages --dir app --dir components --dir lib --dir layouts --dir scripts",
"check:doc-redirects": "node scripts/check-doc-redirects.js",
"test:doc-redirects": "node --test tests/doc-redirects.test.js",
"check:docs-metadata": "node scripts/check-docs-metadata.js",
"test:docs-metadata": "node --test tests/docs-metadata.test.js",
"prepare": "husky"
},
"dependencies": {
301 changes: 301 additions & 0 deletions scripts/check-docs-metadata.js
@@ -0,0 +1,301 @@
#!/usr/bin/env node

const { execSync } = require('child_process')
const fs = require('fs')

function run(command) {
try {
return execSync(command, { encoding: 'utf8' }).trim()
} catch (error) {
console.error(`Failed to execute: ${command}`)
console.error(error.message)
process.exit(1)
}
}

function getChangedDocFiles(baseRef) {
let mergeBase
try {
// Call execSync directly here: run() exits the process on failure, so its
// errors would never reach this catch and the origin/main fallback below
// would be unreachable.
mergeBase = execSync(`git merge-base HEAD ${baseRef}`, { encoding: 'utf8' }).trim()
} catch (error) {
if (baseRef !== 'origin/main') {
mergeBase = run('git merge-base HEAD origin/main')
} else {
throw error
}
}

const docPattern = /^data\/docs\/.*\.mdx$/
const changedFiles = new Set()

// Get committed changes
try {
const committedDiff = execSync(`git diff --name-only --diff-filter=ACMR ${mergeBase} HEAD`, {
encoding: 'utf8',
})
committedDiff
.split('\n')
.filter((file) => docPattern.test(file))
.forEach((file) => changedFiles.add(file))
} catch (error) {
console.error('Unable to read git diff for docs changes.')
console.error(error.message)
process.exit(1)
}

// Get working tree changes
try {
const workingDiff = execSync('git diff --name-only --diff-filter=ACMR HEAD', {
encoding: 'utf8',
})
workingDiff
.split('\n')
.filter((file) => docPattern.test(file))
.forEach((file) => changedFiles.add(file))
} catch (error) {
console.error('Unable to read local git diff for docs changes.')
console.error(error.message)
process.exit(1)
}

return Array.from(changedFiles).filter(Boolean)
}

function getGitAuthorDate(filePath) {
try {
// `-1` limits output to the most recent commit so the result is a single
// YYYY-MM-DD line; `-2` would return two dates joined by a newline, which
// later fails to parse as a Date. Quote the path in case it needs escaping.
const dateString = execSync(`git log -1 --pretty=format:%as -- "${filePath}"`, {
encoding: 'utf8',
}).trim()
return dateString || null
} catch (error) {
return null
}
}

function getStagedDocFiles() {
try {
const stagedFiles = execSync('git diff --cached --name-only --diff-filter=ACMR', {
encoding: 'utf8',
})
const docPattern = /^data\/docs\/.*\.mdx$/
return stagedFiles
.split('\n')
.filter((file) => docPattern.test(file))
.filter(Boolean)
} catch (error) {
console.error('Unable to read staged files.')
console.error(error.message)
process.exit(1)
}
}

function extractFrontmatter(filePath) {
try {
const content = fs.readFileSync(filePath, 'utf8')
const lines = content.split('\n')
let inFrontmatter = false
let frontmatterLines = []
let delimiterCount = 0

for (const line of lines) {
if (line.trim() === '---') {
delimiterCount++
if (delimiterCount === 1) {
inFrontmatter = true
continue
}
if (delimiterCount === 2) {
break
}
}
if (inFrontmatter && delimiterCount === 1) {
frontmatterLines.push(line)
}
}

return frontmatterLines.join('\n')
} catch (error) {
return null
}
}

function validateMetadata(filePath) {
const errors = []
const warnings = []

// Check if file exists
if (!fs.existsSync(filePath)) {
errors.push('file not found')
return { errors, warnings }
}

// Extract frontmatter
const frontmatter = extractFrontmatter(filePath)
if (frontmatter === null) {
errors.push('cannot read file')
return { errors, warnings }
}

const lines = frontmatter.split('\n')
const fieldMap = new Map()

// Parse frontmatter fields
for (const line of lines) {
const match = line.match(/^(\w+):\s*(.*)$/)
if (match) {
fieldMap.set(match[1], match[2].trim())
}
}

// Validate tags field (warning only)
if (!fieldMap.has('tags')) {
warnings.push('missing tags')
} else {
const tagsValue = fieldMap.get('tags')
if (!tagsValue.includes('[')) {
warnings.push('tags must be an array')
} else if (/^\[\s*\]$/.test(tagsValue)) {
warnings.push('tags array cannot be empty')
}
}

// Validate date field (required)
if (!fieldMap.has('date')) {
errors.push('missing date')
} else {
const dateValue = fieldMap.get('date').replace(/['"]/g, '').trim()
const datePattern = /^\d{4}-\d{2}-\d{2}$/
if (!datePattern.test(dateValue)) {
errors.push('invalid date format - use YYYY-MM-DD')
} else {
// Check if date is valid
const date = new Date(dateValue)
if (isNaN(date.getTime())) {
errors.push('invalid date value')
} else {
// Allow dates up to 7 days in the future
const today = new Date()
today.setHours(0, 0, 0, 0)

const maxFutureDate = new Date(today)
maxFutureDate.setDate(maxFutureDate.getDate() + 7)

if (date > maxFutureDate) {
errors.push('date cannot be more than 7 days in the future')
}
}
}
}

// New validation: Compare frontmatter date with git commit date
if (fieldMap.has('date')) {
const frontmatterDate = fieldMap.get('date').replace(/['"]/g, '').trim()
const gitDate = getGitAuthorDate(filePath)

if (gitDate) {
const frontDate = new Date(frontmatterDate)
const commitDate = new Date(gitDate)

if (frontDate < commitDate) {
warnings.push(
`frontmatter date (${frontmatterDate}) is before git commit date (${gitDate})`
)
}
}
}

// Validate title field (required)
if (!fieldMap.has('title')) {
errors.push('missing title')
} else {
const titleValue = fieldMap.get('title').trim()
if (!titleValue || titleValue === '""' || titleValue === "''") {
errors.push('title cannot be empty')
}
}

return { errors, warnings }
}

function main() {
const isPreCommit = process.env.HUSKY_PRE_COMMIT === 'true'
const baseBranch = process.env.GITHUB_BASE_REF
? `origin/${process.env.GITHUB_BASE_REF}`
: process.env.DEFAULT_BRANCH || 'origin/main'

// Get changed files
const changedFiles = isPreCommit ? getStagedDocFiles() : getChangedDocFiles(baseBranch)

if (changedFiles.length === 0) {
console.log('No documentation files to check')
return
}

console.log(`Checking ${changedFiles.length} documentation file(s) for required metadata...\n`)

const invalidFiles = []
const warningFiles = []
let allValid = true

for (const file of changedFiles) {
const { errors, warnings } = validateMetadata(file)

if (errors.length > 0) {
console.error(`❌ ${file}: ${errors.join('; ')}`)
invalidFiles.push({ file, issues: errors })
allValid = false
}

if (warnings.length > 0) {
console.warn(`⚠️ ${file}: ${warnings.join('; ')}`)
warningFiles.push({ file, issues: warnings })
}

if (errors.length === 0 && warnings.length === 0) {
console.log(`✅ ${file}`)
}
}

console.log('')

// Display summary
if (warningFiles.length > 0) {
console.warn('Documentation metadata warnings:')
warningFiles.forEach(({ file, issues }) => {
console.warn(` • ${file}: ${issues.join('; ')}`)
})
console.warn('\nConsider adding tags to improve documentation discoverability.\n')
}

if (!allValid) {
console.error('Documentation metadata validation failed:')
invalidFiles.forEach(({ file, issues }) => {
console.error(` • ${file}: ${issues.join('; ')}`)
})
console.error('\nRequired fields:')
console.error(' - date: Date in YYYY-MM-DD format')
console.error(' - title: Non-empty title field')
console.error(' - tags: Array of tags (recommended)')
console.error('\nExample:')
console.error('---')
console.error('title: My Documentation Page')
console.error(`date: ${new Date().toISOString().split('T')[0]}`)
console.error('tags: ["SigNoz Cloud", "Self-Host"]')
console.error('---\n')
process.exit(1)
}

console.log('✅ All documentation files have valid metadata\n')
}

module.exports = {
getChangedDocFiles,
getStagedDocFiles,
extractFrontmatter,
validateMetadata,
main,
}

if (require.main === module) {
main()
}
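To see what the exported helpers do without touching the filesystem, the frontmatter parsing the script performs can be sketched against an in-memory document. This is a simplified illustration (the hypothetical `parseFrontmatter` below combines `extractFrontmatter` and the field-map loop from `validateMetadata`), not the script's actual API:

```javascript
// Minimal sketch of the script's frontmatter parsing: collect `key: value`
// pairs between the first two `---` delimiters.
function parseFrontmatter(content) {
  const fields = new Map()
  let delimiterCount = 0
  for (const line of content.split('\n')) {
    if (line.trim() === '---') {
      delimiterCount++
      if (delimiterCount === 2) break
      continue
    }
    if (delimiterCount === 1) {
      const match = line.match(/^(\w+):\s*(.*)$/)
      if (match) fields.set(match[1], match[2].trim())
    }
  }
  return fields
}

const doc = [
  '---',
  'title: Example Page',
  'date: 2024-06-01',
  'tags: ["SigNoz Cloud"]',
  '---',
  '',
  '# Body',
].join('\n')

const fields = parseFrontmatter(doc)
console.log(fields.get('title')) // → Example Page
console.log(/^\d{4}-\d{2}-\d{2}$/.test(fields.get('date'))) // → true
```

The validator then checks those fields: `date` must match `YYYY-MM-DD` and parse to a real date, `title` must be non-empty, and `tags` should be a non-empty array.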