The six focus areas and 30 dimensions in the DealFax operational readiness framework are not proprietary inventions. They are drawn from established academic research on M&A transaction dynamics, practitioner literature from the IBBA and AM&AA, and publicly available datasets from the Bureau of Labor Statistics, U.S. Census Bureau, and OSHA. This page documents the evidence basis for each area of assessment.
A DealFax assessment draws from three independent evidence layers. Each layer contributes something the other two cannot provide alone. The goal is a scored picture that is grounded in documented evidence and external context, not solely in what the owner reports about their own business.
When a business is engaged, DealFax identifies its NAICS code, firm size class (by employee count and revenue band), and primary geographic market. These three parameters are used to pull the relevant cut from each dataset — so a 28-person HVAC contractor in a Northern Virginia county gets benchmarks specific to specialty trade contractors of that size in that labor market, not a national HVAC industry average. This specificity is the core of what makes DealFax benchmarks actionable rather than generic. The full data stack — all 16 named sources, what each provides, and which dimensions each informs — is documented in the Data Stack section below.
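The three-parameter lookup described above can be sketched as a keyed benchmark table with a fallback to a broader cut when a county-level figure is unavailable. All names, codes, and figures below are illustrative placeholders, not the DealFax schema or real dataset values:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class BenchmarkKey:
    """Parameters that select a benchmark cut (illustrative, not the DealFax schema)."""
    naics: str       # e.g. "238220" (plumbing, heating, and A/C contractors)
    size_band: str   # e.g. "20-49" employees
    geography: str   # e.g. a county FIPS code, or "US" for the national cut

# Hypothetical pre-loaded benchmark table: key -> metric name -> value.
# The numbers are made up for illustration only.
BENCHMARKS: dict[BenchmarkKey, dict[str, float]] = {
    BenchmarkKey("238220", "20-49", "51059"): {
        "annual_turnover_rate": 0.31,  # JOLTS-style figure (illustrative)
        "median_hourly_wage": 29.40,   # OEWS-style figure (illustrative)
    },
}


def benchmark_cut(naics: str, size_band: str, geography: str) -> dict[str, float]:
    """Return the most specific cut available; fall back to the national cut."""
    key = BenchmarkKey(naics, size_band, geography)
    if key in BENCHMARKS:
        return BENCHMARKS[key]
    national = BenchmarkKey(naics, size_band, "US")
    return BENCHMARKS.get(national, {})
```

The fallback order (county, then national, for the same NAICS and size band) is an assumption for illustration; the point is that the key is always the triple, never a bare industry average.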
The six focus areas reflect operational dimensions that are documented in M&A practitioner literature and academic research as factors that affect how transactions are structured and priced in owner-operated lower-middle market businesses. Each area also maps to at least one publicly available dataset that provides an independent benchmark.
The six focus areas are scored across 30 dimensions total — distributed according to where operational risk concentrates in owner-operated service businesses. The table below lists every dimension, its input type, and the primary evidence source. The full rubric — with anchor criteria and evidence caps per score level — is available on request.
| Dimension | Input Type | Primary Evidence Source |
|---|---|---|
| Area 01 — Owner Dependency · 7 dimensions | ||
| Decision authority documentation | Assessor-observed | Document review, assessor observation |
| Client relationship ownership | Assessor-observed | CRM data, client interview proxies |
| Vendor & supplier relationships | Assessor-observed | Document review, assessor observation |
| Pricing and quoting authority | Assessor-observed | SOP review, assessor observation |
| Revenue tied to owner presence | Calculated | Revenue analysis, assessor observation |
| Daily operational involvement | Assessor-observed | Assessor observation, document review |
| Contract signing authority | Document-verified | Document review, assessor observation |
| Area 02 — Revenue Quality · 6 dimensions | ||
| Customer concentration analysis | Threshold-applied | Revenue records, assessor analysis |
| Recurring vs. project revenue split | Document-verified | Revenue records |
| Contract documentation quality | Document-verified | Document review |
| Revenue growth vs. market benchmark | Benchmarked | Census SUSB, IRS SOI |
| Pricing consistency | Document-verified | Invoice review, assessor observation |
| Revenue seasonality | Benchmarked | Revenue records, assessor analysis |
| Area 03 — Operational Systems · 5 dimensions | ||
| SOP documentation completeness | Document-verified | Document review |
| Accounting software & cloud access | Assessor-observed | Document review, system verification |
| Field management system (FSM/CRM) | Assessor-observed | System access verification |
| Digital vs. paper records | Assessor-observed | Assessor observation |
| Job costing and margin visibility | Document-verified | Accounting system review |
| Area 04 — Workforce Stability · 5 dimensions | ||
| Annual turnover rate | Benchmarked | BLS JOLTS (NAICS-level) |
| Key person identification & exposure | Assessor-observed | Assessor observation, org review |
| Average employee tenure | Document-verified | Payroll data, assessor interview |
| Compensation documentation | Benchmarked | BLS OEWS / QCEW, document review |
| Cross-training depth | Assessor-observed | Assessor observation, document review |
| Area 05 — Customer Health · 4 dimensions | ||
| Google rating vs. market average | Benchmarked | Google Maps, Census CBP market area |
| Review volume & velocity trends | Benchmarked | Google Maps, BBB, Angi, Yelp |
| Customer retention rate | Benchmarked | CRM data, assessor interview |
| Lead source diversification | Assessor-observed | Assessor interview, CRM data |
| Area 06 — Succession & Management · 3 dimensions | ||
| Management depth below owner | Assessor-observed | Org chart, assessor interview |
| Documented transition plan | Document-verified | Document review |
| Operational knowledge documentation | Document-verified | Document review |
This is the canonical reference for the DealFax data enrichment and benchmarking process. Every Assessed Report draws from 16 named public sources across three categories — industry benchmarking, reputation signals, and compliance verification. The key principle: benchmarks are pulled for the specific NAICS code, firm size class, and geography of the business being assessed. Not national averages. Not generic sector data. The benchmark that applies to a 28-person HVAC contractor in a mid-size metro is different from the one that applies to a 5-person IT services firm in a rural county — and DealFax is built to make that distinction.
These datasets provide the external context that makes a business's own operational numbers meaningful. They are not records about the specific business — they are the relevant backdrop for the industry, size class, and geography the business operates in. Without this layer, a metric is just a number. Against the right benchmark, it's a signal.
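The "metric vs. benchmark" principle can be made concrete with a small comparison function. The classification bands and the 10% tolerance below are illustrative placeholders, not DealFax's actual scoring thresholds:

```python
def turnover_signal(observed_rate: float, benchmark_rate: float,
                    tolerance: float = 0.10) -> str:
    """Classify observed turnover against the NAICS/size/geography benchmark.

    The tolerance and the labels are illustrative, not DealFax's rubric.
    """
    if benchmark_rate <= 0:
        raise ValueError("benchmark rate must be positive")
    ratio = observed_rate / benchmark_rate
    if ratio <= 1.0 - tolerance:
        return "below benchmark (favorable)"
    if ratio >= 1.0 + tolerance:
        return "above benchmark (risk flag)"
    return "in line with benchmark"
```

For example, 20% annual turnover reads very differently against a 31% industry benchmark (favorable) than against a 15% one (risk flag); the benchmark, not the raw number, carries the meaning.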
| Dataset | Source | What It Provides | Primary Dimensions Informed |
|---|---|---|---|
| Job Openings & Labor Turnover Survey (JOLTS) | U.S. Bureau of Labor Statistics | Sector-level annual turnover, hire, and separation rates by NAICS code and region. Benchmarks employee attrition against the norm for the specific industry and geography. | D-06 Annual Turnover Rate, D-07 Key Person Exposure |
| Statistics of U.S. Businesses (SUSB) | U.S. Census Bureau | Revenue and employment growth rates by NAICS code and firm size class. Benchmarks revenue trajectory and headcount growth against comparable businesses in the same sector and size band. | D-14 Revenue Growth vs. Market Benchmark |
| County Business Patterns (CBP) | U.S. Census Bureau | Annual count of business establishments, employment, and wages by county and NAICS code. Provides local market density and competitive context. | D-14 Revenue Growth, D-26 Google Rating vs. Market Average |
| Occupational Employment & Wage Statistics (OEWS) | U.S. Bureau of Labor Statistics | Mean and median wages by occupation and geography at the metropolitan statistical area level. Assesses whether documented compensation is consistent with market rates for the role and location. | D-09 Compensation Documentation |
| Quarterly Census of Employment & Wages (QCEW) | U.S. Bureau of Labor Statistics | Quarterly employment levels, wages, and establishment counts by NAICS and county. More granular and current than OEWS — provides a county-level picture of employment trends for the specific geography. | D-06 Turnover Rate, D-09 Compensation Documentation |
| Annual Business Survey (ABS) | U.S. Census Bureau | Business characteristics, owner demographics, financing, and innovation activity by NAICS and firm size. Contextualizes ownership structure and business maturity against comparable firms. | General business profile benchmarking |
| Business Dynamics Statistics (BDS) | U.S. Census Bureau | Annual data on firm survival, entry rates, exit rates, job creation, and destruction by industry and size class. Directly benchmarks business longevity — a 22-year operating history means something different for a 5-person firm than a 50-person firm. | D-01 through D-05 (Owner Dependency context), general longevity benchmarking |
| Statistics of Income (SOI) — Business Returns | U.S. Internal Revenue Service | Industry-level financial ratios, profit margins, revenue per employee, and asset composition by NAICS and size class. Provides financial performance benchmarks beyond employment and wage data. | D-14 Revenue Growth vs. Market Benchmark, D-12 Recurring vs. Project Revenue context |
Reputation data provides an independently verifiable, buyer-facing signal about the health of the customer base that cannot be gamed or self-reported. Google is the primary benchmark source due to its volume and consistency across all industries and geographies. The others are used selectively based on the business's sector and coverage availability.
| Source | What It Provides | How It's Used | Dimensions Informed |
|---|---|---|---|
| Google Maps / Places | Star rating, review count, and review velocity for the business and comparable competitors in the same NAICS category and geographic area. | Primary reputation benchmark across all industries. The business's rating and review trajectory is benchmarked against the average for comparable businesses in the same category and local geography. | D-26 Google Rating vs. Market Average, D-27 Review Volume & Velocity |
| Better Business Bureau (BBB) | Accreditation status, complaint history, complaint resolution pattern, and BBB rating. | Trust and compliance signal. Used to identify unresolved complaint patterns or regulatory concerns not visible in financial records. Not used as a primary rating benchmark. | D-27 Review Velocity (supplementary), general compliance signal |
| Angi / HomeAdvisor | Rating, review count, and pro status for home services trade businesses. | Secondary reputation benchmark for home services trades — HVAC, plumbing, electrical, general contracting. Used where Angi coverage is meaningful and adds context beyond Google alone. | D-26 Google Rating vs. Market Average (supplementary for trades) |
| Yelp | Star rating and review volume for businesses with active Yelp presence. | Supplementary reputation signal for professional services, food and hospitality, and health services where Yelp coverage is stronger. Not used as a primary benchmark for trades businesses where coverage is thin. | D-26, D-27 (supplementary where coverage is sufficient) |
These are records about the specific business — not benchmarks or industry context. Each is checked independently against the business's legal name, license numbers, or employer identification. The applicable state licensing databases are identified as part of scope definition and pulled for every state in which the business operates; this is a national process, not one specific to Virginia or Maryland.
| Database | Source | What It Verifies | Dimensions Informed |
|---|---|---|---|
| OSHA Establishment-Specific Injury & Illness Data | U.S. Department of Labor — OSHA | Reported workplace injuries, illness rates, and citations for the specific establishment. Identifies any serious or repeat violations on the public record. | D-19 Digital vs. Paper Records (regulatory file), D-07 Key Person Exposure |
| State Licensing & Regulatory Databases | Applicable state licensing authority — identified per engagement based on where the business operates and what services it provides | Contractor, tradesperson, and professional license status, expiration date, and disciplinary history. The relevant database varies by state and occupation — DealFax identifies and pulls the applicable database for each engagement. | D-07 Key Person Exposure, D-21 Management Depth (licensed successor) |
| Secretary of State / Business Entity Records | Applicable state Secretary of State office | Business entity status (active, dissolved, in good standing), registered agent, and annual filing history. Confirms the business is legally current in its state of operation. | General compliance verification, D-16 SOP Documentation context |
| DOL Wage & Hour Division — Enforcement Data | U.S. Department of Labor — Wage and Hour Division | Public record of wage and hour violations, back wage findings, and civil money penalties by employer. Surfaces any compensation compliance exposure not visible from financial statements. | D-09 Compensation Documentation, general compliance signal |
The TRUST Index is a five-dimension assessor-observed evaluation of the seller's own readiness to drive a successful transition — independent of the business's operational score. It addresses a gap in standard operational due diligence: a business can be operationally ready while the seller is not, and vice versa.
The TRUST Index is not self-reported. Scores are assigned by the assessor based on observed behavior, document evidence, and interview responses. Three design principles prevent gaming. First, each dimension must be assessor-observed — the owner cannot self-report a high score. Second, documentary evidence is required to reach the upper score levels — without it, the score is capped at a defined ceiling regardless of what the owner says. Third, the most critical dimensions are cross-referenced against independently verifiable sources, so discrepancies between what the owner represents and what the record shows are surfaced during the assessment.
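The second anti-gaming principle, the evidence ceiling, can be sketched as a simple cap applied after the assessor assigns a raw score. The 3.0 ceiling used here is an illustrative placeholder; the actual cap is defined per dimension in the DealFax rubric:

```python
def apply_evidence_cap(raw_score: float, has_documentation: bool,
                       cap_without_docs: float = 3.0) -> float:
    """Cap an assessor-assigned score when documentary evidence is absent.

    The 3.0 default ceiling is illustrative; real caps are set per dimension
    in the rubric (available on request).
    """
    if not 1.0 <= raw_score <= 5.0:
        raise ValueError("scores run 1.0 to 5.0")
    if has_documentation:
        return raw_score
    return min(raw_score, cap_without_docs)
```

The effect is that an owner's verbal account alone can never reach the upper score levels: without documents, even a strong interview performance lands at the ceiling.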
| Dimension | What It Measures | Evidence Required for High Score |
|---|---|---|
| Transparency & Responsiveness | Seller's cooperation with the assessment process — document provision, response completeness, and accuracy of information provided | Timely document delivery; no material omissions identified in cross-reference against public records |
| Knowledge Transfer Readiness | Extent to which institutional knowledge has been documented or delegated prior to the assessment | SOPs, org chart, documented client introduction plan; evidence of delegation in place |
| Commitment to Exit | Clarity and conviction of the seller's exit timeline and readiness to execute a transition within a defined window | Documented exit timeline; no evidence of ambivalence or undisclosed contingencies that would affect timing |
| Relationship Introduction Willingness | Whether key client and vendor relationships have been introduced to a successor or documented for transfer | Evidence of relationship introduction; documented contacts independent of owner's personal network |
| Post-Close Transition Support | Seller's demonstrated willingness and practical ability to support post-close operations during the transition period | Documented availability commitment, defined consulting scope, timeline clarity, no conflicts with other obligations |
The TRUST Index classification scale: 1.0 to below 2.0 = Unprepared Seller; 2.0 to below 3.0 = Engaged but Unprepared; 3.0 to below 4.0 = Transition-Ready Seller; 4.0 to 5.0 = Optimally Positioned. A score at a band boundary falls into the higher band.
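The scale maps directly to a classification function. The published ranges share their endpoints (1.0–2.0, 2.0–3.0, and so on), so the convention below, where a score exactly at a band edge falls into the higher band, is an assumption for illustration:

```python
def trust_band(score: float) -> str:
    """Map a TRUST Index score to its classification band.

    Assumes half-open bands: a score exactly at a boundary (e.g. 2.0)
    falls into the higher band. This convention is an assumption.
    """
    if not 1.0 <= score <= 5.0:
        raise ValueError("TRUST Index scores run 1.0 to 5.0")
    if score < 2.0:
        return "Unprepared Seller"
    if score < 3.0:
        return "Engaged but Unprepared"
    if score < 4.0:
        return "Transition-Ready Seller"
    return "Optimally Positioned"
```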
The complete scoring rubric, evidence cap criteria, and TRUST Index assessor guide are available to brokers, academic researchers, and practitioners on request.
Request Methodology Document