lakehouse/scripts/staffing/build_workers_v9.sh
root c3c9c2174a
staffing: B+C — safe views (candidates/workers/jobs) + workers_500k_v9 build script
Decision B from reports/staffing/synthetic-data-gap-report.md §7
(plus C: client_workerskjkk.parquet typo file removed from
data/datasets/ — it was never tracked, so there is no git effect).

PII enforcement was UNVERIFIED in workers_500k_v8 (the corpus
staffing_inference mode embeds chunks from). Verified 2026-04-27 by
inspecting data/vectors/meta/workers_500k_v8.json — `source:
"workers_500k"` confirms v8 was built directly from the raw table, so
the LLM has been seeing names / emails / phones / resume_text for every
staffing query.

This commit closes the boundary at the catalog metadata layer:

candidates_safe (overhauled — was failing with invalid SQL 434×/day on a
nonexistent `vertical` column reference copy-pasted from job_orders):
  drops last_name, email, phone, hourly_rate_usd
  candidate_id masked (keep first 3, last 2)
  row_filter: status != 'blocked'
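The "keep first 3, last 2" masking rule applied to candidate_id (and to client_id in jobs_safe below) can be sketched in shell. `mask_id` is an illustrative helper only — the real masking happens in the catalog metadata layer, not in a script — and the sample ID is made up:

```shell
#!/usr/bin/env bash
# mask_id: keep the first 3 and last 2 characters, star out the middle.
mask_id() {
  local id="$1"
  local n=${#id}
  if (( n <= 5 )); then
    # Too short to mask without revealing everything; pass through.
    printf '%s\n' "$id"
  else
    local stars
    stars=$(printf '%*s' $((n - 5)) '' | tr ' ' '*')
    printf '%s%s%s\n' "${id:0:3}" "$stars" "${id: -2}"
  fi
}

mask_id "CAND-1029384"   # → CAN*******84
```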

workers_safe (NEW):
  drops name, email, phone, zip, communications, resume_text
  keeps role, city, state, skills, certifications, archetype, scores
  resume_text + communications carry verbatim PII (full names) and
  there is no in-view text scrubber, so they are dropped wholesale.
  Skills + certifications + scores carry the matching signal for
  staffing inference.
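A catalog-side definition along these lines might express the view (a sketch only — field names and structure are assumed, not copied from the actual data/_catalog/views/workers_safe.json):

```json
{
  "name": "workers_safe",
  "source": "workers_500k",
  "drop_columns": ["name", "email", "phone", "zip",
                   "communications", "resume_text"],
  "keep_columns": ["worker_id", "role", "city", "state",
                   "skills", "certifications", "archetype",
                   "reliability", "responsiveness", "availability"]
}
```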

jobs_safe (NEW):
  drops description (often quotes client names verbatim)
  client_id masked (keep first 3, last 2)
  bill_rate / pay_rate kept — commercial info, not PII per staffing PRD

scripts/staffing/build_workers_v9.sh (NEW):
  POSTs /vectors/index to rebuild workers_500k_v9 from `workers_safe`
  rather than the raw table. Embedded text is constructed from the
  view projection so PII never enters the corpus by construction.
  30+ minute background job — not run inline. After it completes,
  flip config/modes.toml `staffing_inference` matrix_corpus from
  workers_500k_v8 to workers_500k_v9 and restart gateway.
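The post-completion flip amounts to a one-line change in config/modes.toml. The key names `staffing_inference` and `matrix_corpus` come from this commit message; the exact table header and surrounding structure are assumed:

```toml
[staffing_inference]          # section path assumed
matrix_corpus = "workers_500k_v9"   # was: "workers_500k_v8"
```

Remember the gateway only reads modes.toml at startup, so the restart is required for the flip to take effect.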

Distillation v1.0.0 substrate untouched. audit-full passed clean
(16/16 required) before this commit; will re-verify after.
2026-04-27 10:46:03 -05:00


#!/usr/bin/env bash
# build_workers_v9.sh — Decision B (corpus rebuild side).
#
# Rebuilds workers_500k_v9 vector corpus from workers_safe view rather
# than the raw workers_500k table. Closes the PII enforcement gap
# (verified 2026-04-27 that v8 was built directly from raw — LLM saw
# names/emails/phones/resume_text for every staffing query).
#
# Run as a background job — embedding v8 (50K rows) took ~4 min; v9's
# 500K rows will take 30+ min. Do not block on this.
#
# Usage:
# ./scripts/staffing/build_workers_v9.sh
# LH_GATEWAY=http://localhost:3100 ./scripts/staffing/build_workers_v9.sh
#
# After it completes:
# - Verify via: curl /vectors/indexes/workers_500k_v9 | jq
# - Flip config/modes.toml `staffing_inference` matrix_corpus to v9
# - Restart gateway to pick up the modes.toml change
set -euo pipefail
GATEWAY="${LH_GATEWAY:-http://localhost:3100}"
# The /vectors/index endpoint accepts {name, sql, embed_model, ...}.
# SQL pulls from workers_safe (see data/_catalog/views/workers_safe.json)
# so the embedded text never contains raw PII, by construction.
#
# Concatenated text is what gets embedded — keep it short enough that
# 500K rows × N chunks fits in disk + memory budgets but still carries
# the match signal (role, location, skills, scores).
BODY=$(cat <<'JSON'
{
"name": "workers_500k_v9",
"sql": "SELECT CAST(worker_id AS VARCHAR) AS doc_id, CONCAT(role, ' in ', city, ', ', state, '. Skills: ', COALESCE(skills, ''), '. Certifications: ', COALESCE(certifications, ''), '. Archetype: ', COALESCE(archetype, ''), '. Scores — reliability ', CAST(reliability AS VARCHAR), ', responsiveness ', CAST(responsiveness AS VARCHAR), ', availability ', CAST(availability AS VARCHAR), '.') AS text FROM workers_safe",
"embed_model": "nomic-embed-text",
"chunk_size": 500,
"overlap": 50,
"source_dataset": "workers_safe",
"bucket": "primary"
}
JSON
)
echo "POSTing /vectors/index → workers_500k_v9 (background job)..."
# --fail: non-2xx responses exit nonzero so set -e aborts the script
curl -sS --fail -X POST "${GATEWAY}/vectors/index" \
-H 'content-type: application/json' \
-d "$BODY"
echo
echo "Job started. Monitor progress:"
echo " curl ${GATEWAY}/vectors/indexes/workers_500k_v9 | jq"
echo " watch -n 5 'curl -s ${GATEWAY}/vectors/jobs | jq'"