RSS preview of the blog of The Practical Developer

The Future of Cyber Resilience for Complex AWS Environments is Here

2025-11-27 23:05:48

2025 has seen the cloud landscape continue to evolve at an extraordinary pace. As organizations accelerate their AI, analytics, and digital transformation workloads, many of us are experiencing a significant increase in complexity.

Systems are becoming more distributed, with workloads spread across multiple regions, accounts, and vendors. With complexity comes fragmentation, and with fragmentation a sharp rise in risk around cyber threats, identity compromise, and multi-cloud governance. Many of us are left wondering how to maintain visibility across disparate systems, and how to handle protection, resilience, and recovery at scale.

This is why I was excited to learn about some of the latest announcements and releases from Commvault, unveiled at SHIFT 2025.


Commvault Cloud Unity - a unified platform designed for the realities of cloud at scale

Most organizations’ AWS environments have grown organically, spanning multiple accounts and regions, with the vast majority also using multiple cloud vendors and running workloads and data on-premises. This approach allows for the adoption of best-of-breed technologies and services; the trade-off is that such mixed environments become increasingly difficult to manage and protect.

Commvault Cloud Unity is a major release that unifies data security, cyber recovery, and identity resilience into one AI-enabled platform. It provides a single pane of glass spanning all workloads, regions, and protection policies, across AWS, on-premises, and hybrid environments.

Features of the Commvault Cloud Unity platform:

AI-driven discovery of all AWS workloads across accounts and regions

Commvault Cloud Unity automatically identifies AWS workloads and data across EC2, EKS, RDS, DynamoDB, S3, Lambda-backed services, and more.


Clear visibility into what’s protected (and what isn’t)

One of the biggest challenges is understanding where data is located. What’s protected? What’s under-protected, or not protected at all? In addition to helping you discover your data landscape, Commvault Cloud Unity also provides automated classification and protection policy recommendations.

Synthetic Recovery: Clean, Complete Restorations for AWS Workloads

This is, in my view, one of the most exciting capabilities Commvault has introduced.

AWS estates often include:

  • Distributed EC2 workloads
  • Massive S3 data lakes
  • Numerous databases (RDS, Aurora, DynamoDB)
  • Containerized workloads running on EKS

If any part of this is compromised, restoring cleanly can be incredibly complex and nuanced. Previously, you'd have to choose between an older backup that's clean and a recent snapshot that might be contaminated. Neither option is great.


Synthetic Recovery changes that completely.

It uses AI to identify compromised blocks or files, removes them automatically, and reassembles what's left into a synthetically clean recovery point that preserves all clean, recent data. This is incredibly valuable for AWS environments where speed and precision are essential.

No more rolling back to a recovery point from last Tuesday because it’s the only one you trust.

Request a demo of this exciting feature to see it in action!

Threat detection across cloud providers

Threat Scan: Protecting AWS Backups from Hidden Threats

Threat Scan brings AI-driven scanning of AWS backup datasets:

  • Detection of encrypted files, malware, and indicators of compromise
  • The ability to inspect recovery points before restoring them
  • Proactive identification of risks inside S3 object versions, EC2 snapshots, and more

For AWS customers maintaining vast amounts of data in S3 or using snapshot-heavy workflows for EC2 and RDS, this adds vital intelligence to the recovery process.

With attackers now targeting backups directly, the security of AWS backup data has never been more critical.

80% of attacks involve an identity breach!

Identity Resilience for AWS-Hybrid Environments

AWS customers who rely on Active Directory for authentication, whether that’s through AWS Managed AD or integrated with on-premises AD, will benefit from new identity resilience enhancements, which detect, audit, and reverse malicious identity changes.

Commvault Unity also includes the ability to spot identity anomalies, maintain forensic change logs, roll back malicious AD changes in real time, and even safely test AD recovery inside a cleanroom on AWS. All of this is invaluable for anyone operating a hybrid IAM setup on AWS.

Solving the challenges that AWS customers struggle with the most

Collectively, these announcements represent a major step forward for AWS resilience. They bring clarity where there has been confusion, automation where there has been manual effort, and integrated protection where there have been fragmented tools.

Commvault Cloud Unity solves the challenges that AWS customers struggle with the most, like data sprawl, inconsistent policies, cyber risk, and complex backup management. With one secure, automated platform spanning hybrid and multi-cloud environments, organizations benefit from faster recovery, streamlined operations, and complete confidence that their critical data is properly protected and recoverable when it matters most.

Want to Learn More?

Exciting times for Commvault, for AWS, and for those of us responsible for mission critical workloads in the cloud! If you’re interested in hearing more about all of these announcements, you can watch all the sessions from SHIFT 2025 on demand, and request a demo!

Commvault at re:Invent 2025

If you’re heading to AWS re:Invent this year, visit the Commvault team in the Expo Hall at booth #621 to talk cyber recovery and AWS-native resilience, experience some very cool demos, and more!

🚀 Terraform State File Management with AWS S3

2025-11-27 22:52:53

🧩 What Is the Terraform State File?

Whenever Terraform builds your AWS infrastructure, it needs a way to remember what it created.
That memory is stored in a file called:

terraform.tfstate

This file tracks:

  • EC2 instances
  • S3 buckets
  • IAM roles
  • Databases
  • And their metadata
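
You can see exactly which resources the state is tracking with terraform state list (the resource addresses below are just illustrative placeholders):

# List every resource recorded in the current state
terraform state list

# Example output (hypothetical resource addresses)
# aws_instance.web
# aws_s3_bucket.assets
# aws_iam_role.app_role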

Terraform uses this file to compare:

  • Desired State (your .tf files)
  • Actual State (what exists in AWS)
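
In practice, this comparison is exactly what terraform plan does on every run: it refreshes the actual state from AWS, diffs it against your configuration, and proposes any changes needed to reconcile the two.

# Preview the diff between desired state (.tf files) and actual state (AWS)
terraform plan

# When everything matches, Terraform reports something like:
# "No changes. Your infrastructure matches the configuration."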

❌ Why You Should NOT Store State Files Locally
🔐 1. Security Risk
The state file contains sensitive info like:

  • AWS account IDs
  • Secrets
  • Passwords
  • ARNs

Keeping it on your laptop? Yeah… risky.
👥 2. Team Collaboration Issues

Local state = conflicts, overwrites, broken infra.
💥 3. Data Loss

If your laptop dies or the state file is deleted, Terraform loses track of your cloud resources.

☁️ The Solution: Remote Backend Using AWS S3
A remote backend stores your state file in S3 instead of on your machine.

Benefits include:

✔ Secure, encrypted storage
✔ State locking
✔ Team collaboration
✔ Backups via S3 versioning
✔ Environment separation (dev, test, prod)

🛠️ How to Configure AWS S3 Remote Backend
Step 1: Create S3 Bucket (Outside Terraform)

Never create the state bucket using Terraform itself; the bucket has to exist before Terraform has anywhere to store its state.

Enable:

  • Server-side encryption
  • Versioning
  • Block Public Access
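
If you set the bucket up by hand in the console, those three settings are all you need. If you prefer the AWS CLI, a minimal sketch looks like this (the bucket name matches the example backend config below; substitute your own, and adjust the region):

# Create the bucket (us-east-1 needs no LocationConstraint)
aws s3api create-bucket --bucket terraform-state-bucket-amit-123456789 --region us-east-1

# Turn on versioning so older state versions stay recoverable
aws s3api put-bucket-versioning --bucket terraform-state-bucket-amit-123456789 \
  --versioning-configuration Status=Enabled

# Enable default server-side encryption
aws s3api put-bucket-encryption --bucket terraform-state-bucket-amit-123456789 \
  --server-side-encryption-configuration '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'

# Block all public access
aws s3api put-public-access-block --bucket terraform-state-bucket-amit-123456789 \
  --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true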

Step 2: Add Backend Configuration

Create a backend.tf file:

# Terraform settings: required provider and the S3 remote backend
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 6.0"
    }
  }

  backend "s3" {
    bucket       = "terraform-state-bucket-amit-123456789"
    key          = "dev/terraform.tfstate"
    region       = "us-east-1"
    use_lockfile = true
    encrypt      = true
  }
}

# Configure the AWS provider
provider "aws" {
  region = "us-east-1"
}

🔎 What Each Parameter Means:

  • bucket → name of your S3 bucket
  • key → S3 object path where the tfstate file is stored
  • region → bucket region
  • encrypt → enables server-side encryption for the state object
  • use_lockfile → enables S3-native state locking, preventing simultaneous terraform apply runs
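
Because the key is just an S3 object path, environment separation usually means giving each environment its own key. One common pattern (a sketch, not the only way) is to leave key out of backend.tf and pass it at init time from each environment's working directory:

# Partial backend configuration: same bucket, different state object per environment
terraform init -backend-config="key=dev/terraform.tfstate"
terraform init -backend-config="key=prod/terraform.tfstate"

# (add -reconfigure if you ever switch keys in the same working directory)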

Step 3: Initialize Backend
Run:

terraform init


Terraform will migrate your local state into S3:

“Successfully configured the backend ‘s3’!”
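
If you already had resources tracked in a local terraform.tfstate, init will prompt you to copy that existing state into the new backend; you can also ask for the migration explicitly:

# Explicitly migrate existing local state into the S3 backend
terraform init -migrate-state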

This video from Piyush Sachdeva gives a clear and practical explanation of how Terraform manages its state file and why moving that state to an AWS S3 backend is important for real-world projects. He walks through the risks of keeping state locally, the benefits of using a remote backend, and the exact steps to set it up using S3.

🔗 Connect With Me

If you enjoyed this post or want to follow my #30DaysOfAWSTerraformChallenge journey, feel free to connect with me here:

💼 LinkedIn: Amit Kushwaha

🐙 GitHub: Amit Kushwaha

📝 Hashnode: Amit Kushwaha

🐦 Twitter/X: Amit Kushwaha

Why the Directory Is the Core of IAM: The Digital Heartbeat of Every Organization

2025-11-27 22:47:37

In a world where businesses run on SaaS, APIs, cloud apps, and hybrid environments, Identity and Access Management (IAM) has become one of the most foundational pillars of enterprise security. Everyone talks about MFA, SSO, Zero Trust, role-based access, and least privilege, but surprisingly few talk about the real center of IAM:

The Directory.

Your directory isn't just an address book.
It's not just "Active Directory," "Okta Universal Directory," or "Entra ID."
It is and always has been the source of truth for identity across your entire digital ecosystem.

Think of IAM as a living organism:

  • Policies are the brain
  • Workflows are the muscles
  • Applications are the organs
  • Authentication is the pulse
  • And the Directory is the heart

Without the heart, nothing moves.
Nothing functions.
Nothing connects.
Let's explore why.

  1. The Directory Is the Source of Truth (SOT) for Identity

Every identity decision starts with a single question:
"Who is this user?"
The directory answers this consistently and authoritatively.
It provides:

  • User profiles
  • Attributes (department, title, location)
  • Group memberships
  • Security identifiers
  • Device trust status
  • Authentication factors
  • Role mappings

Every IAM tool (IGA, PAM, SSO, Zero Trust, RBAC, ABAC) depends on the directory for accurate data.
If the directory is wrong…
everything downstream is wrong.

  • Wrong role → Wrong access
  • Wrong group → Wrong permissions
  • Wrong attributes → Wrong policies
  • Incomplete data → Incomplete governance

The quality of your directory directly impacts the quality of your entire IAM program.

  2. The Directory Controls Access Everywhere

Every access decision ultimately checks directory data:

  • Logging into SaaS apps? → Directory
  • Authorizing API access? → Directory
  • Enforcing Zero Trust policies? → Directory
  • Assigning RBAC roles? → Directory
  • Auto-provisioning new hires? → Directory
  • Offboarding terminated users? → Directory

Even your "passwordless future" vision still depends on directory-backed identities.
The directory is literally the gatekeeper.

  3. The Directory Reduces Security Risk at Scale

Most identity-related breaches come from:

  • Orphaned accounts
  • Duplicate identities
  • Inactive accounts
  • Over-permissioned groups
  • Unmanaged admin access
  • Stale user attributes

These are directory problems, not SSO or MFA problems.
A clean directory equals a secure organization.
A messy directory equals:

  • Access creep
  • God-mode permissions
  • Rogue admins
  • Shadow identities
  • Failed audits
  • Exposed SaaS data
  • Massive lateral movement

Simply improving directory hygiene reduces more risk than buying most security tools.

  4. The Directory Powers Automation

Modern IAM automation, including JML (Joiner-Mover-Leaver) processes, lifecycle events, and workflow triggers, runs on directory data.
If your directory is aligned with HR and updated in real time, you get:

✔ Instant onboarding

New hires receive all required access automatically.

✔ Dynamic access

Role changes automatically adjust privileges.

✔ Fast, complete offboarding

Access is revoked across every app and system.

✔ Zero manual tickets

No more "Please add Alice to this app" emails.

Automation is impossible without a high-quality directory.

  5. The Directory Connects Your Entire SaaS Ecosystem

Companies today don't use "a few apps."
They use hundreds, sometimes thousands.
Your directory acts as the universal connector between:

  • HR → IAM
  • IAM → SSO
  • SSO → Apps
  • Apps → Roles
  • Roles → Policies
  • Policies → Access

Without a strong directory, your IAM ecosystem becomes fragmented:

  • Multiple identity stores
  • Inconsistent user data
  • Manual provisioning
  • Misaligned roles
  • Shadow IT everywhere
  • No governance or visibility

A unified directory removes friction across your digital organization.

  6. Directories Are Evolving Fast

Directories used to be simple.
Active Directory. On-prem. LDAP. A tree of OUs.
Now directories are becoming:

  • Cloud-native
  • API-first
  • Schema-flexible
  • Attribute-rich
  • Lifecycle-aware
  • Contextual (risk, device, behavior)
  • Global across SaaS ecosystems

Modern IAM platforms like Okta UD, Entra ID, JumpCloud, and cloud directories are becoming intelligent hubs, not just identity repositories.
The future of IAM is built on top of this intelligence.

  7. Application Governance Still Depends on the Directory

Even new categories like Enterprise Application Governance (EAG), the space where AppGovern operates, rely on directory data for:

  • Application ownership
  • Admin roles
  • License allocations
  • User-to-app mapping
  • Shadow IT detection
  • Risk scoring
  • Lifecycle management

The directory gives the identity context.
EAG adds the application context.
Together, they create a unified governance layer.
This partnership will define the next decade of IAM evolution.

  8. The Directory IS the IAM Program

If you want a high-performing IAM program, you don't start with SSO.

You don't start with IGA. You don't start with PAM.

You start with the directory.

  • Clean the directory → Clean the IAM program
  • Align the directory → Align access
  • Automate the directory → Automate IAM
  • Govern the directory → Govern applications

The directory is not just a component of IAM.
It is IAM.

Final Thoughts: The Directory Is the New Digital Identity Core

If you want:

  • Better security
  • Faster onboarding
  • Cleaner audits
  • Stronger Zero Trust
  • Reduced SaaS chaos
  • Lower access risk
  • Better application governance

Start with your directory.
It's the digital heart of your organization, beating behind every login, every access decision, every workflow, and every application.
Fix the heart, and the whole IAM body becomes stronger.

How n8n Automates Contract Review and Approvals

2025-11-27 22:44:44

Legal teams handle more contracts than ever before. As your caseload grows, manual tasks start slowing down your entire operation. Simple activities like routing documents, gathering comments, tracking approvals, and keeping every version updated can take hours of unnecessary effort.

When your team relies on emails and spreadsheets, contracts move unevenly and delays become unavoidable. This is exactly why more legal teams now look for practical ways to reduce repetitive work and keep contract cycles predictable.

Contract automation helps you avoid these issues and gives your team more time to focus on actual legal work. n8n makes this possible by turning routine processes into smooth, rules-based workflows that require very little manual involvement.

Challenges Legal Teams Face in Contract Review and Approval

Even experienced legal teams struggle with contract processes that involve several people, complex routing, and compliance expectations. If your team handles contracts every day, you may already face challenges like these.

  • Slow and inconsistent review cycles: Manual routing increases delays and makes each contract follow a different timeline.
  • Lack of visibility into contract status: It becomes difficult to know who has reviewed the document or where it sits in the approval chain.
  • Multiple reviewers and approval layers: High involvement often leads to backlogs and communication gaps.
  • Version control issues: Drafts arrive through different channels, which makes it easy to lose track of the latest version.
  • Compliance and audit risks: Manual handling increases the chance of skipped steps or missing documentation.
  • Increased operational costs: Inefficient workflows require more time, more coordination, and sometimes extra resources.

How n8n Automates Contract Review and Approval Step by Step

n8n gives legal teams a clear, consistent structure for every contract passing through the firm. Here is how each step becomes easier and more reliable.

Automated Contract Intake

The workflow begins the moment a contract enters your system. n8n connects with email, CRM tools, web forms, or client portals to detect new submissions instantly. It creates an internal record, stores essential information, and initiates the correct path without any manual sorting.

Intelligent Reviewer Assignment

Every contract follows its own rules based on type, value, or client category. n8n uses these rules to assign the contract to the right reviewer automatically. The assigned person receives instant notifications, which reduces waiting time and prevents unnecessary back and forth communication.

Version Control and Centralized Storage

n8n organizes all drafts in a single location such as Google Drive, SharePoint, or OneDrive. The workflow renames files, maintains history, and alerts reviewers when new versions arrive. This prevents outdated document edits and reduces confusion during collaboration.

Automated Review and Comment Workflow

Once reviewers receive the contract, n8n keeps the process on track. You can design sequential or parallel review steps, depending on your internal policies. The workflow sends reminders, follows up automatically, and triggers escalation if a reviewer takes too long to respond.

Approval Routing

After the review stage, n8n moves the document into the approval phase. It follows predefined rules to push the contract to the correct approvers. Your team can track progress in real time and rely on dashboards that stay accurate without manual updates.

E-Signature Integration with DocuSign or Adobe Sign

n8n connects with trusted e-signature tools to send documents for signing right after approval. It monitors signature progress and updates both your team and your clients when the process completes. This shortens turnaround time and eliminates the need for repeated reminders.

Final Archiving and System Updates

When the contract is signed, n8n stores it in the correct location automatically. The workflow also updates your CRM, ERP, or contract system to reflect the new status. Every action becomes part of a complete audit trail, which helps you maintain compliance during internal or external reviews.

Final Thoughts

Manual contract handling slows legal teams down and increases the risk of missed steps, outdated drafts, or compliance issues. Automating the review and approval process with n8n helps your firm reduce repetitive work, improve accuracy, and move each contract through a predictable path.

This approach scales smoothly without depending on more staff or constant support from automation developers. If your legal team wants consistent results, faster workflows, and cleaner documentation, now is the ideal time to adopt a modern automation strategy supported by reliable workflow automation services. A more structured process helps your team protect valuable time and focus on meaningful legal work that creates real impact for your clients.

A Blogger Found a Way to Play the Contents of Regular CDs on the PS5

2025-11-27 22:43:06

The author of the YouTube channel Will It Work has found a way to play the contents of regular compact discs on a PlayStation 5. All it takes is an external USB disc drive and a disc burned in the Data CD format.

The first PlayStation generations played compact discs and served their users as all-purpose media centers. PlayStation 4 dropped Audio CD and Video CD support, as the company bet on streaming services. The PS5 generation also does not officially play Audio CDs, but it can open various music formats from a USB drive.

The blogger found that if you burn a disc as a Data CD and connect it to the console through an external USB disc drive, the system treats it as a USB storage device. You can then browse the files and play the music with the built-in media player.

Introducing LeanSpec: A Lightweight SDD Framework Built from First Principles

2025-11-27 22:34:15

Earlier this year, I was amazed by agentic AI coding with Claude Sonnet 3.7. The term "vibe coding" hadn't been coined yet, but that's exactly what I was doing—letting AI generate code while I steered the conversation. It felt magical. Until it didn't.

After a few weeks, I noticed patterns: code redundancy creeping in, intentions drifting from my original vision, and increasing rework as the AI forgot context between sessions. The honeymoon was over. I needed structure, but not the heavyweight processes that would kill the speed I'd gained.

That search led me through several existing tools—Kiro, Spec Kit, OpenSpec—and eventually to building LeanSpec, a lightweight Spec-Driven Development framework that hits v0.2.7 today with 10 releases in under three weeks. This post shares why I built it, what makes it different, and how you can try it yourself.

The Problem: Vibe Coding's Hidden Costs

The Vibe Coding Trap
AI coding assistants are incredibly productive—until they're not. Without structured context, AI generates plausible but inconsistent code, leading to technical debt that compounds session after session.

If you've used AI coding tools extensively, you've likely encountered these patterns:

| Symptom | Root Cause | Impact |
| --- | --- | --- |
| Code redundancy | AI doesn't remember previous implementations | Duplicate logic scattered across files |
| Intention drift | Context lost between sessions | Features that don't quite match your vision |
| Increased rework | No persistent source of truth | Circular conversations explaining the same thing |
| Inconsistent architecture | No structural guidance | Components that don't fit together cleanly |

The industry's answer has been Spec-Driven Development (SDD)—writing specifications before code to give AI (and humans) persistent context. But when I explored the existing tools, I found a gap.

Related Reading
New to SDD? Start with my foundational article Spec-Driven Development: A Systematic Approach to Complex Features for methodology basics, or dive into the 2025 SDD Tools Landscape for a comprehensive comparison of industrial tools. Want to try the methodology without installing anything? The Practice SDD Without the Toolkit tutorial has you covered.

Why I Built LeanSpec

My journey through the SDD landscape revealed three categories of tools, each with trade-offs that didn't fit my needs:

Vendor lock-in: Kiro (Amazon's SDD IDE) offers tight integration but requires abandoning my existing workflow. I like my tools—switching IDEs wasn't an option.

Cognitive overhead: Spec Kit provides comprehensive structure, but its elaborate format creates significant cognitive load. Even with AI-assisted writing, parsing and maintaining those specs demands mental bandwidth that feels excessive for solo and small-team work.

Missing project management: OpenSpec came closest to my ideal—lightweight and flexible—but lacked the project management capabilities I needed to track dozens of specs across multiple projects.

I wanted something different: a methodology, not just a tool. Something like Agile—a set of principles anyone can adopt, with lightweight tooling that gets out of the way.

So I built LeanSpec. And then I used LeanSpec to build LeanSpec.

First Principles: The Foundation

LeanSpec isn't just tooling—it's built on five first principles that guide every design decision:

Context Economy: Specs must fit in working memory—both human and AI. Target under 300 lines. If you can't read it in 10 minutes, it's too long.

Signal-to-Noise Maximization: Every line must inform decisions. No boilerplate, no filler, no ceremony for ceremony's sake.

Intent Over Implementation: Capture why, not just how. Implementation details change; intentions persist.

Bridge the Gap: Specs serve both humans and AI. If either can't understand it, the spec has failed.

Progressive Disclosure: Start simple, add structure only when pain is felt. No upfront complexity.

These principles aren't just documentation—LeanSpec's validate command enforces them automatically.

Key Features

Web UI for Visual Management

The feature I'm most excited about: lean-spec ui launches a full web interface for managing your specs visually.

# Launch the web UI
npx lean-spec ui

The UI provides Kanban-style board views, spec detail pages with Mermaid diagram rendering, and dependency visualization—all without leaving your browser. Perfect for planning sessions or reviewing project status.

LeanSpec Kanban Board View

LeanSpec Spec Detail View

First Principles Validation

LeanSpec doesn't just store specs—it validates them against first principles:

# Check your specs against first principles
lean-spec validate

# Output:
# specs/045-user-auth/README.md
#   ⚠️  warning  Spec exceeds 300 lines (342)  context-economy
#   ⚠️  warning  Missing overview section      structure
# 
# ✖ 2 warnings in 1 spec

This keeps specs lean and meaningful, preventing the specification bloat that plagues heavyweight SDD tools.

Smart Search & Project Management

Finding relevant specs shouldn't require remembering exact names:

# Semantic search across all specs
lean-spec search "authentication flow"

# Advanced queries
lean-spec search "status:in-progress tag:api"
lean-spec search "created:>2025-11-01"

The Kanban board gives you instant project visibility:

lean-spec board

# 📋 LeanSpec Board
# ─────────────────────────────────────
# 📅 Planned (12)     🚧 In Progress (3)     ✅ Complete (47)
# ─────────────────────────────────────

MCP Server for AI Integration

LeanSpec includes an MCP (Model Context Protocol) server, enabling AI assistants to directly interact with your specs:

{
  "mcpServers": {
    "leanspec": {
      "command": "npx",
      "args": ["@leanspec/mcp"]
    }
  }
}

Works with Claude Code, Cursor, GitHub Copilot, and other MCP-compatible tools. AI agents can search specs, read context, and update status—all programmatically.
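
If you want to sanity-check the server before wiring it into a client, you can start the same command the config above uses on its own (it will sit waiting for an MCP client on stdio, which is the transport a command/args entry like this implies):

# Run the LeanSpec MCP server directly (stdio transport)
npx @leanspec/mcp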

Example Projects for Quick Start

New to SDD? Start with a working example:

# Scaffold a complete tutorial project
npx lean-spec init --example dark-theme

Three examples available: dark-theme, dashboard-widgets, and api-refactor—each demonstrating different SDD patterns.

The Journey: Building LeanSpec with LeanSpec

The most meta aspect of this project: after the initial release, LeanSpec has been developed entirely using LeanSpec.

| Milestone | Date | Notes |
| --- | --- | --- |
| First line of code | Oct 23, 2025 | Started with basic spec CRUD |
| v0.1.0 (First release) | Nov 2, 2025 | 10 days from scratch to release |
| v0.2.0 (Production-ready) | Nov 10, 2025 | First principles validation, comprehensive CLI |
| v0.2.7 (Current) | Nov 26, 2025 | 10 releases in 24 days |

Over 120 specs have been created within LeanSpec itself—covering features, architecture decisions, reflections, and even marketing strategy. The feedback loop is tight: identify friction → write spec → implement → validate with real use.

I've also applied LeanSpec to other projects:

  • Crawlab (12k+ stars) — web crawler management platform
  • This blog (marvinzhang.dev)
  • Upcoming projects under the codervisor org

The pattern holds across all of them: specs provide context that survives between sessions, AI stays aligned with my intentions, and I spend less time re-explaining.

What Makes LeanSpec Different

If you've read my SDD Tools analysis, you know I evaluated six major tools in this space. Here's where LeanSpec fits:

| Aspect | Heavyweight Tools | LeanSpec |
| --- | --- | --- |
| Learning curve | Days to weeks | Minutes |
| Spec overhead | Extensive upfront work | Write as you go |
| Token cost | Often >2,000 per spec | <300 lines target |
| Flexibility | Rigid structure | Adapt to your workflow |
| Vendor lock-in | Often required | Works anywhere |
| Philosophy | Tool-first | Methodology-first |

LeanSpec is "lean" in multiple senses:

  • Methodology: Like Agile, it's principles you can adopt regardless of tooling
  • Cognitive load: Low overhead, quick to learn
  • Token economy: Specs stay small, fitting in AI context windows
  • Flexibility: Adapt to your workflow, not the other way around

Getting Started

Try LeanSpec in under 5 minutes:

# Install globally
npm install -g lean-spec

# Initialize in your project
lean-spec init

# Create your first spec
lean-spec create user-authentication

# Launch the web UI
lean-spec ui

Or try an example project:

npx lean-spec init --example dark-theme

Already using Spec Kit or OpenSpec? Check out the migration guide—the transition is straightforward.

What's Next

LeanSpec is actively evolving. Current development focuses on:

  • VS Code extension for inline spec management (Spec 17)
  • AI chatbot UI for interactive spec assistance (Spec 94)
  • Comprehensive internationalization support (Spec 91)
  • GitHub multi-project integration (Spec 98)

I built LeanSpec to solve my own problems—code quality degradation from vibe coding, context loss between AI sessions, the cognitive overhead of heavyweight SDD tools. If you face similar challenges, I hope it helps you too.

Links:

Questions, feedback, or feature requests? Open an issue or start a discussion. I read everything.