Show HN: SciCraft – generate scientific Claude Code skills on demand (176 built)

  • Posted 8 hours ago by jaechang
  • 1 point
https://github.com/jaechang-hits/scicraft
Most Claude Code plugins ship a fixed set of skills and stop there. For general software development, that's fine. For scientific research, it's a fundamental mismatch.

  Every scientist works at a different intersection of tools. A computational
  biologist running GWAS uses a completely different stack than a structural
  biologist doing MD simulations, or a medicinal chemist running virtual
  screens, or a microscopist doing image segmentation. No static plugin can
  anticipate that breadth — and the moment your workflow touches a tool
  that isn't covered, the plugin becomes useless for that task.

  SciCraft is built around a CLAUDE.md that encodes a complete skill authoring
  workflow. Give Claude Code any scientific tool name:

      "Add a skill for CellRanger"
      "Add a skill for CREST structural variant caller"
      "Add a skill for our internal mass spec preprocessing pipeline"

  It runs a 6-step process:

      Topic → Classify (pipeline / toolkit / database / guide)
            → Category (pick from 11 life sciences domains)
            → Research (official docs, GitHub, PyPI)
            → Author (SKILL.md: 10+ runnable code blocks, Key Parameters
                      table, Troubleshooting matrix)
            → Register (registry.yaml entry)
            → Validate (pytest suite checks structure and code depth)
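  For a concrete picture, a Register-step entry might look something like
  this — the actual registry.yaml schema isn't shown in the post, so every
  field name here is a guess:

  ```yaml
  # Hypothetical registry.yaml entry — field names are assumptions,
  # not SciCraft's actual schema.
  cellranger:
    type: pipeline            # one of: pipeline / toolkit / database / guide
    category: genomics        # one of the 11 life-sciences domains
    description: 10x Genomics single-cell RNA-seq preprocessing
    skill_file: skills/genomics/cellranger/SKILL.md
  ```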

  The result is a CI-validated skill file committed to your repo — not a
  one-off answer, but a permanent reusable asset available in every future
  session. Your collaborators inherit it too.

  Scientific computing makes this especially valuable:
  - Life sciences libraries (Scanpy, RDKit, MDAnalysis, Biopython) have
    complex, non-obvious APIs that LLMs frequently hallucinate
  - The field moves fast — tools released last month have no training data
  - Lab-specific pipelines and instruments will never appear in any
    off-the-shelf plugin
  - Research scope shifts constantly: a lab pivoting from bulk RNA-seq to
    spatial transcriptomics needs a completely different skill set overnight

  The pytest suite enforces minimum code block counts, required section
  structure, parameter table depth, and troubleshooting rows — so generated
  skills are immediately usable, not rough drafts.
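  A minimal sketch of what such structural checks could look like — this is
  illustrative only; SciCraft's actual test names, thresholds, and required
  section titles may differ:

  ```python
  # Hypothetical structural validator for a generated SKILL.md.
  # Thresholds and section names are assumptions, not SciCraft's real suite.
  import re

  MIN_CODE_BLOCKS = 10
  REQUIRED_SECTIONS = ("Key Parameters", "Troubleshooting")  # assumed names

  def count_code_blocks(md: str) -> int:
      # Fenced code blocks come in pairs of ``` markers.
      return len(re.findall(r"^```", md, flags=re.MULTILINE)) // 2

  def check_skill(md: str) -> list[str]:
      """Return a list of problems; an empty list means the skill passes."""
      problems = []
      if count_code_blocks(md) < MIN_CODE_BLOCKS:
          problems.append("too few runnable code blocks")
      for section in REQUIRED_SECTIONS:
          if section not in md:
              problems.append(f"missing section: {section}")
      return problems
  ```

  Wrapped in pytest, each check becomes a test case, so a skill that fails
  validation never lands in the repo.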

  ---

  The repo ships 176 pre-built scientific skills across genomics, drug
  discovery, proteomics, cell biology, and biostatistics (Scanpy, GATK,
  RDKit, DESeq2, AutoDock Vina, cBioPortal, gnomAD, and more) so common
  workflows are covered from day one. But the 176 are a starting point —
  the authoring system is what makes SciCraft adapt to your specific research,
  not just the research someone else anticipated.

  The design uses progressive disclosure: only each skill's description is
  in context during planning. The full file loads on demand. Context stays
  lean; precise scientific API knowledge is available when the agent needs it.
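  As a rough sketch of that two-phase loading (the real mechanism is Claude
  Code's own skill loader; the file layout and field names below are
  assumptions):

  ```python
  # Illustrative progressive-disclosure sketch — not SciCraft internals.
  from pathlib import Path

  def planning_context(registry: dict) -> str:
      # During planning, only one-line descriptions enter the context window.
      return "\n".join(
          f"- {name}: {meta['description']}" for name, meta in registry.items()
      )

  def load_skill(skills_dir: Path, name: str) -> str:
      # The full SKILL.md is read only once the agent selects that skill.
      return (skills_dir / name / "SKILL.md").read_text()
  ```

  The planning phase pays only a few tokens per skill; the detailed API
  knowledge costs context only for the one skill actually in use.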

  Curious whether others have hit the ceiling of static plugin systems in
  research contexts — and what approaches you've tried for keeping domain
  knowledge current with a fast-moving field.

  https://github.com/jaechang-hits/scicraft
