1 Purpose & Scope
This Data Classification Policy ("Policy") establishes a framework for classifying all data created, collected, processed, stored, or transmitted by Find The Breach LLC ("FindTheBreach") based on sensitivity and criticality. Proper classification ensures that data receives the appropriate level of protection throughout its lifecycle. This Policy is aligned with SOC 2 CC1 (Control Environment), CC6 (Logical and Physical Access Controls), ISO/IEC 27001:2022 Annex A.5.12–A.5.14, and NIST SP 800-60.
Scope. This Policy applies to:
- All data in any form (electronic, physical, verbal) regardless of storage medium or location
- All employees, contractors, consultants, and third-party service providers who create, access, process, or manage FindTheBreach data
- All systems and environments including the SaaS platform, Docker containers, PostgreSQL databases, scanner tool integrations, APIs, backup systems, and development environments
- Customer data including vulnerability scan results, target configurations, account credentials, and billing information
Data must be classified at the time of creation or acquisition. Unclassified data must be treated as Confidential until formally classified by the data owner.
2 Classification Levels
All data is classified into one of four levels based on the potential business impact of unauthorized disclosure, modification, or loss:
| Level | Definition | Impact if Compromised | Examples at FindTheBreach |
|---|---|---|---|
| Public | Information explicitly approved for public distribution. No confidentiality requirements. | Negligible — no business, legal, or reputational harm | Marketing materials, public website content, published blog posts, press releases, open-source license disclosures, publicly available API documentation |
| Internal | Information intended for internal use by authorized FindTheBreach personnel. Not intended for public distribution. | Low — minor operational disruption or embarrassment | Internal policies and procedures, internal meeting notes, organizational charts, non-sensitive project plans, internal training materials, aggregated (non-identifying) platform metrics |
| Confidential | Sensitive information that could cause significant harm to the company, customers, or partners if disclosed. Access limited to authorized individuals with a business need. | Moderate to High — financial loss, regulatory penalties, reputational damage, contractual breach | Customer vulnerability scan results, target host configurations, customer account data (email, company name), API keys, webhook URLs, billing records, scan authorization documents, internal security audit reports, scanner tool configurations |
| Restricted | Highly sensitive information subject to the strictest controls. Unauthorized access could cause severe or irreversible harm. Access limited to named individuals with explicit authorization. | Severe — major financial loss, legal liability, criminal exposure, critical reputational damage | AES-256-GCM encryption keys, database master credentials, TOTP/MFA secret seeds, customer passwords (bcrypt hashes), production database backups, penetration test raw findings with exploit proof-of-concept, incident response forensic data, BAA-protected health data |
3 Handling Requirements
The minimum security controls for each classification level are defined below. Higher classification levels inherit all requirements of lower levels.
| Control | Public | Internal | Confidential | Restricted |
|---|---|---|---|---|
| Encryption at Rest | — | — | AES-256-GCM | AES-256-GCM + HSM key mgmt |
| Encryption in Transit | — | TLS 1.2+ | TLS 1.3 | TLS 1.3 + certificate pinning |
| Access Control | None | Authenticated user | RBAC + need-to-know | Named individuals + MFA/TOTP |
| Audit Logging | — | Basic | Full audit trail | Full + real-time alerting |
| Sharing | Unrestricted | Internal only | Authorized + NDA/DPA | Named recipients + CISO approval |
| Backup | — | Standard | Encrypted + geo-separated | Encrypted + geo-separated + integrity verification |
4 Labeling Standards
Data must be labeled with its classification level to ensure that handlers understand the required protection measures. Labeling methods vary by medium:
- Electronic Documents: Classification label must appear in the document header, footer, or metadata (e.g., "CONFIDENTIAL — FindTheBreach"). For automated systems, classification is embedded in file metadata or database column annotations.
- Emails: The classification level must be included in the subject line prefix (e.g., "[RESTRICTED]") for Confidential and Restricted data. Internal email may omit the prefix.
- Database Records: PostgreSQL tables containing Confidential or Restricted data are tagged in the data catalog with their classification level. Column-level classification is maintained for tables containing mixed-classification data (e.g., customer account tables).
- API Responses: API endpoints serving Confidential or Restricted data include the `X-Data-Classification` response header. API documentation identifies the classification level of each endpoint's response data.
- Docker Images & Containers: Container images are labeled with OCI annotations indicating whether they process Restricted data. Containers handling Restricted data run in isolated network segments.
- Physical Media: Physical media (if any) must be visibly labeled with the classification level. Restricted media must be stored in locked containers with access logs.
Public data does not require classification labeling, though labeling as "PUBLIC" is encouraged for clarity.
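The response-header labeling above can be sketched as a small lookup used by the API layer. This is an illustrative sketch only: the prefix table, helper name, and route paths are assumptions, not the platform's actual code. Note that unknown paths default to Confidential, per the rule that unclassified data is treated as Confidential until formally classified.

```python
# Hypothetical classification lookup used by an API layer to attach the
# X-Data-Classification response header. Prefixes and routes are
# illustrative assumptions, not FindTheBreach's actual endpoints.
CLASSIFICATION_BY_PREFIX = {
    "/api/v1/scans": "Confidential",    # vulnerability scan results
    "/api/v1/billing": "Confidential",  # billing records
    "/api/v1/docs": "Public",           # published API documentation
}

def classification_headers(path: str) -> dict:
    """Return the X-Data-Classification header for an API path.

    Longest matching prefix wins. Unknown paths default to
    Confidential, per the policy's unclassified-data rule.
    """
    match = ""
    for prefix in CLASSIFICATION_BY_PREFIX:
        if path.startswith(prefix) and len(prefix) > len(match):
            match = prefix
    level = CLASSIFICATION_BY_PREFIX.get(match, "Confidential")
    return {"X-Data-Classification": level}
```

A middleware hook (e.g. a Flask `after_request` handler) would call this once per response and merge the result into the outgoing headers.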
5 Storage Requirements
Data storage controls must match or exceed the requirements of the data's classification level.
- Public & Internal: May be stored on standard company-managed systems. Internal data must not be stored on personal devices or unauthorized cloud services.
- Confidential: Must be stored in encrypted form using AES-256-GCM. PostgreSQL databases use Transparent Data Encryption (TDE) or application-level encryption for sensitive columns (e.g., scan results, customer credentials). Docker volumes containing Confidential data use encrypted storage drivers. Backups are encrypted and stored in geographically separated locations.
- Restricted: All Confidential storage requirements apply, plus: encryption keys are managed through a hardware security module (HSM) or equivalent key management service. Access to storage systems requires MFA/TOTP. Storage of Restricted data on portable devices or removable media is prohibited without explicit CISO approval. Database access is restricted to named service accounts with individually audited credentials.
Data Residency. Customer scan data is stored in the region specified in the customer's service agreement. Cross-border data transfers comply with applicable regulations (GDPR Chapter V, Standard Contractual Clauses).
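Application-level AES-256-GCM column encryption, as required above for sensitive PostgreSQL columns, can be sketched as follows. This is a minimal illustration using the `cryptography` package: key handling is deliberately simplified, and in production the key would come from the HSM or key management service, never be generated inline or stored with the data.

```python
# Sketch of application-level AES-256-GCM encryption for a sensitive
# database column. Key management is simplified for illustration; the
# policy requires HSM/KMS-managed keys for Restricted data.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_column(key: bytes, plaintext: bytes, aad: bytes = b"") -> bytes:
    """Encrypt one column value; the 12-byte random nonce is prepended."""
    nonce = os.urandom(12)  # must be unique per encryption under a key
    return nonce + AESGCM(key).encrypt(nonce, plaintext, aad)

def decrypt_column(key: bytes, blob: bytes, aad: bytes = b"") -> bytes:
    """Split off the nonce and decrypt; raises InvalidTag on tampering."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, aad)

key = AESGCM.generate_key(bit_length=256)  # 32-byte key, HSM-managed in prod
blob = encrypt_column(key, b"scan result payload", aad=b"scan_results.v1")
assert decrypt_column(key, blob, aad=b"scan_results.v1") == b"scan result payload"
```

Binding the table/column name as associated data (the `aad` argument) ensures a ciphertext copied into a different column fails authentication on decrypt.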
6 Transmission Requirements
Data in transit must be protected according to its classification level to prevent interception, tampering, or unauthorized disclosure.
- Public: No encryption requirements for transmission, though TLS is preferred for all web traffic.
- Internal: Must be transmitted over encrypted channels (TLS 1.2 or higher). Internal API-to-API communication within the Docker network uses mTLS where feasible.
- Confidential: Must be transmitted exclusively over TLS 1.3. Email transmission of Confidential data requires S/MIME encryption or an approved secure file transfer mechanism. API transmissions include integrity verification (HMAC signatures). Webhook payloads containing scan results are signed and encrypted.
- Restricted: All Confidential transmission requirements apply, plus: transmission is limited to approved, pre-authorized channels. Certificate pinning is required for Restricted API integrations. File transfers require end-to-end encryption with recipient verification. Transmission of Restricted data via email is prohibited; approved secure transfer mechanisms must be used.
Scanner Data Transmission. Data flowing between the 35+ scanner tools and the FindTheBreach platform is treated as Confidential at minimum. Scanner results are encrypted in transit from the scanner container to the aggregation layer and at rest in the PostgreSQL results database.
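The HMAC integrity verification required for API transmissions and signed webhook payloads can be sketched with the standard library. The header name and secret format here are assumptions for illustration; the platform's actual signing scheme may differ.

```python
# Minimal sketch of HMAC-SHA256 payload signing for webhooks/API calls,
# assuming a shared per-endpoint secret. Scheme details are illustrative.
import hashlib
import hmac

def sign_payload(secret: bytes, payload: bytes) -> str:
    """Return a hex HMAC-SHA256 signature, sent e.g. as an X-Signature header."""
    return hmac.new(secret, payload, hashlib.sha256).hexdigest()

def verify_payload(secret: bytes, payload: bytes, signature: str) -> bool:
    """Verify with a constant-time comparison to avoid timing side channels."""
    expected = sign_payload(secret, payload)
    return hmac.compare_digest(expected, signature)
```

The receiver recomputes the signature over the raw request body and rejects the payload when verification fails, so tampering in transit is detectable even over an already-encrypted channel.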
7 Disposal & Destruction
Data must be disposed of securely when it is no longer needed for business, legal, or regulatory purposes. Disposal methods must render data unrecoverable.
- Public & Internal: Standard deletion is sufficient. No special destruction procedures are required.
- Confidential: Electronic data must be securely deleted using cryptographic erasure (destroying the encryption key) or overwriting (NIST SP 800-88 guidelines). PostgreSQL records are purged using
DELETEfollowed byVACUUMto reclaim storage. Docker volumes and container images containing Confidential data are destroyed upon decommissioning. Backup media follows the same destruction standards. - Restricted: All Confidential disposal requirements apply, plus: destruction must be witnessed and documented with a certificate of destruction. Cryptographic keys associated with Restricted data are destroyed using NIST-approved methods. Physical media is shredded (NIST SP 800-88, Purge level). Disposal records are retained for seven (7) years.
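Cryptographic erasure, as described above, works because each data set is encrypted under its own key: destroying the key renders every ciphertext unrecoverable without touching the stored data. A conceptual sketch, with a purely illustrative in-memory keystore standing in for the real key management service:

```python
# Conceptual sketch of cryptographic erasure. Each customer's data is
# encrypted under a dedicated data-encryption key; purging that key
# makes all associated ciphertexts unrecoverable. The keystore here is
# an illustrative stand-in for an HSM/KMS.
import os

keystore = {
    "cust-42": os.urandom(32),  # per-customer 256-bit data-encryption keys
    "cust-7": os.urandom(32),
}

def crypto_erase(customer_id: str) -> None:
    """Destroy one customer's key; their ciphertexts become undecipherable."""
    key = bytearray(keystore.pop(customer_id))
    for i in range(len(key)):  # best-effort zeroization of the in-memory copy
        key[i] = 0
```

In practice the key destruction itself is performed inside the HSM/KMS using its audited delete operation, and the deletion event is what the certificate of destruction records.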
Customer Data Disposal. Customer scan data is deleted within 30 days of account termination or upon customer request, consistent with our Privacy Policy and applicable data retention requirements.
8 Roles & Responsibilities
The following roles are defined to ensure clear accountability for data classification and protection:
| Role | Responsibilities |
|---|---|
| Data Owner | A senior-level individual (typically a department head or product lead) accountable for the data within their domain. Responsibilities: assign the classification level, approve access requests, define retention periods, ensure compliance with this Policy, and review classifications annually. Data owners are designated in the data inventory. |
| Data Custodian | Technical personnel (typically Engineering or DevOps) responsible for implementing and maintaining the security controls required by the classification level. Responsibilities: enforce encryption standards (AES-256-GCM), manage database access controls, configure Docker container isolation, implement backup and disposal procedures, and report control failures to the data owner and CISO. |
| Data User | Any individual authorized to access, process, or handle data in the course of their duties. Responsibilities: handle data in accordance with its classification level, report suspected misclassification or policy violations, complete data handling training, and never share data beyond their authorization. |
| CISO (Policy Owner) | Overall accountability for this Policy, the classification framework, exception management, and compliance monitoring. Approves all Restricted data access requests and classification exceptions. Reports on classification compliance to executive management quarterly. |
9 Re-Classification Process
Data classifications are not permanent. As business requirements, regulatory obligations, or data sensitivity change, data may need to be re-classified to a higher or lower level.
- Triggers for Re-Classification: Changes in regulatory requirements (e.g., new HIPAA applicability), changes in contractual obligations, data aggregation that increases sensitivity, passage of time reducing sensitivity (e.g., published vulnerability data), customer request, or discovery of misclassification.
- Upgrade (Lower → Higher): Any data user who identifies that data may be under-classified must report it to the data owner immediately. Upgraded data must have enhanced controls applied within 24 hours. Upgraded data is treated at the higher classification level immediately upon identification, prior to formal approval.
- Downgrade (Higher → Lower): Downgrade requests must be submitted in writing by the data owner to the CISO. The request must include justification, risk assessment, and confirmation that all regulatory and contractual requirements are met at the proposed lower level. CISO approval is required before any controls are relaxed.
- Annual Review: Data owners must review the classification of all data assets within their domain at least annually. The CISO coordinates the annual classification review and reports results to executive management.
- Documentation: All re-classification actions are documented in the data inventory with the previous level, new level, justification, approver, and effective date. Re-classification records are retained for the lifetime of the data plus three (3) years.
10 Policy Review & Maintenance
This Policy is reviewed at least annually by the CISO and approved by executive management. Ad-hoc reviews are triggered by regulatory changes, significant security incidents involving data misclassification, audit findings, or material changes to the data landscape.
- Version Control: This Policy is maintained under version control. All revisions include a change summary, effective date, and approval record.
- Training: All personnel receive data classification training during onboarding and annually thereafter. Role-specific training is provided for data owners and data custodians. Training completion is tracked and reported.
- Compliance Monitoring: The CISO conducts quarterly spot checks to verify that data is classified and handled in accordance with this Policy. Non-compliance findings are tracked in the risk register and remediated within 30 days.
- Exceptions: Exceptions to this Policy require written justification, a compensating controls plan, a defined expiration date (maximum 12 months), and CISO approval. Exceptions are reviewed quarterly and expire automatically unless renewed.
Policy Contact
Chief Information Security Officer (CISO), Find The Breach LLC
Email: security@findthebreach.com