Security-by-Design Frameworks for Digital Health Through a Cross-Sectional Audit of Cybersecurity Implementation Across 156 Health Systems

Open access | Published: March 28, 2026

Volume 1, Issue 1 (2026)


Abstract

Evidence on security-by-design (SbD) implementation in healthcare remains limited despite escalating cyber risk. We audited 156 health systems across 18 countries using harmonized NIST and ISO-aligned assessments. Mean cybersecurity maturity was 2.7/5.0, and only 34.0% had formal SbD programs. SbD adoption was associated with lower critical/high vulnerability density (IRR 0.57, 95% CI 0.48-0.68), lower breach rates (IRR 0.33, 95% CI 0.21-0.51), and faster remediation (18.4 vs 34.7 days; p<0.001). In adjusted models, SbD adoption (beta=0.74), dedicated CISO presence (beta=0.84), and security budget share (beta=0.42) independently predicted higher maturity (all p<0.001). Small organizations had the largest structural deficits in staffing, budget, and breach burden. SbD is underused but strongly associated with better security outcomes, supporting governance mandates plus targeted capacity support for resource-constrained providers.

Introduction

The healthcare sector has emerged as a prime target for cybercriminals, with attack frequency, sophistication, and impact all escalating dramatically [1,2]. Healthcare data breaches affected over 45 million individuals in 2024 alone, representing a 29% increase from the previous year and continuing a decade-long upward trajectory [3]. Ransomware attacks, in particular, have paralyzed hospital operations, delayed patient care, and imposed costs averaging $2 million per incident [4].

The digital health transformation, while offering substantial clinical benefits, has exponentially expanded the healthcare attack surface. Cloud-based electronic health records, interconnected medical devices, telehealth platforms, patient portals, and mobile health applications each introduce novel vulnerabilities that threat actors actively exploit [5]. The Internet of Medical Things (IoMT) now encompasses over 10 billion connected devices globally, many with inadequate security controls [6].

Traditional approaches to healthcare cybersecurity have emphasized perimeter defense and reactive incident response—strategies increasingly inadequate against modern threats [7]. Security-by-design (SbD) represents a paradigm shift, integrating security considerations throughout the system development lifecycle rather than as after-the-fact additions [8]. SbD principles include threat modeling during requirements definition, secure coding practices, defense-in-depth architectures, privacy-preserving design patterns, and continuous security validation [9].

The theoretical advantages of SbD are compelling: reduced vulnerability introduction, lower remediation costs, improved resilience, and enhanced stakeholder trust [10]. Research in non-healthcare sectors demonstrates that addressing security during design costs 6–100× less than post-deployment remediation [11]. However, healthcare-specific implementation evidence remains sparse, with limited understanding of adoption rates, implementation barriers, and effectiveness in clinical contexts.

Healthcare presents unique challenges for SbD implementation. Clinical workflow requirements, regulatory complexity, resource constraints, safety-critical operations, and the heterogeneous technology ecosystem all complicate security integration [12]. Additionally, healthcare organizations vary enormously in size, sophistication, and cybersecurity resources—factors potentially moderating SbD feasibility and effectiveness.

This study addresses these knowledge gaps through comprehensive assessment of cybersecurity practices across diverse healthcare organizations. Our objectives were to: (1) characterize current security-by-design adoption and implementation patterns; (2) evaluate associations between SbD practices and security outcomes; (3) identify organizational factors associated with security maturity; and (4) assess implementation barriers and enablers.

Methods

Study Design

We conducted a cross-sectional security audit of healthcare organizations using standardized assessment protocols. The study received ethical approval from the coordinating institution's IRB with waivers of individual consent for organizational assessments.

Site Selection and Recruitment

We employed stratified random sampling to ensure diversity across organizational characteristics. Sample strata included:

  • Size: Small (<100 beds), medium (100–499 beds), large (≥500 beds)
  • Type: Academic medical centers, community hospitals, primary care networks, specialty facilities
  • Geography: North America, Europe, Asia-Pacific, other regions
  • Digital maturity: Low, medium, high (based on HIMSS Analytics scores)

Of 312 organizations invited, 156 (50.0%) participated. Non-participation reasons included resource constraints (n=67), competing priorities (n=43), and concerns about disclosure risks (n=46).

Assessment Protocol

Framework Development

Assessment instruments integrated standards from:

  • NIST Cybersecurity Framework (CSF 2.0) [13]
  • ISO/IEC 27001 (information security management) [14]
  • ISO/TS 14441 (health informatics security) [15]
  • HITRUST CSF (healthcare-specific controls) [16]
  • OWASP Software Assurance Maturity Model (SAMM) [17]

Domains Assessed

Governance and Risk Management: Security policies, governance structures, risk assessment processes, third-party risk management, regulatory compliance programs

Security-by-Design Implementation: Threat modeling adoption, secure development lifecycle integration, design review processes, security requirements engineering

Technical Controls: Access controls, encryption, network segmentation, vulnerability management, penetration testing, security monitoring

Incident Response: Response plans, tabletop exercises, forensic capabilities, communication protocols, recovery procedures

Workforce Security: Security awareness training, role-based competencies, phishing simulations, developer security education

Medical Device Security: Device inventory, patching capabilities, network isolation, vendor security requirements

Assessment Methods

Assessments combined multiple data sources:

  1. Document Review: Security policies, procedures, architecture diagrams, audit reports
  2. Technical Scanning: Vulnerability scans, configuration assessments, penetration testing (with authorization)
  3. Interviews: Structured interviews with CISOs, CIOs, developers, clinical staff (n=847 total)
  4. Observations: Direct observation of security practices and workflows
  5. System Demos: Demonstrations of security controls and monitoring capabilities

Assessments were conducted on-site where possible (n=89) or remotely with virtual system access (n=67).

Outcome Measures

Primary Outcomes

Security Maturity Score: Composite score (0–5 scale) across all domains, calculated as a weighted average of subdomain scores. Scores were categorized as: Initial (0–1), Developing (1–2), Defined (2–3), Managed (3–4), Optimized (4–5).
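
As a minimal illustration, the categorization can be sketched as follows (the boundary handling at exact integers is an assumption, since the published ranges overlap at their endpoints):

```python
def maturity_level(score):
    """Map a composite 0-5 maturity score to its named level.

    Half-open intervals [lower, upper) with 5.0 assigned to "Optimized"
    are an assumption; the paper lists the ranges as 0-1, 1-2, etc.
    """
    if not 0.0 <= score <= 5.0:
        raise ValueError("score must lie in [0, 5]")
    levels = [
        (1.0, "Initial"),
        (2.0, "Developing"),
        (3.0, "Defined"),
        (4.0, "Managed"),
        (5.0, "Optimized"),
    ]
    for upper, name in levels:
        if score < upper:
            return name
    return "Optimized"  # score == 5.0

print(maturity_level(2.7))  # the study-wide mean falls in "Defined"
```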

Vulnerability Density: Critical and high-severity vulnerabilities per 1,000 lines of code (for developed applications) or per 100 systems (for infrastructure), measured through automated scanning and manual penetration testing.
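
A sketch of this normalization with made-up inputs (the function name and values are illustrative, not from the study protocol):

```python
def vulnerability_density(findings, loc=None, systems=None):
    """Critical/high-severity findings normalized per 1,000 lines of code
    (developed applications) or per 100 systems (infrastructure).
    Exactly one denominator must be supplied."""
    if (loc is None) == (systems is None):
        raise ValueError("supply exactly one of loc or systems")
    if loc is not None:
        return findings / loc * 1_000
    return findings / systems * 100

# Illustrative inputs: 46 findings in 20,000 LOC; 12 findings across 300 systems
print(round(vulnerability_density(46, loc=20_000), 2))   # per 1K LOC
print(round(vulnerability_density(12, systems=300), 2))  # per 100 systems
```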

Security Incident Rate: Reported security incidents per 1,000 patient records annually, including breaches, ransomware attacks, phishing successes, and insider threats.

Secondary Outcomes

Time-to-Remediation: Mean days from vulnerability identification to remediation

Compliance Score: Percentage of applicable controls implemented based on HIPAA, GDPR, and relevant national regulations

Security Culture Score: Employee-reported security attitudes and behaviors (validated 12-item scale)

Business Continuity: Recovery time objectives achieved during simulated incidents

Derived Indices

To improve cross-site comparability across heterogeneous infrastructures, we derived two normalized indices.

Domain-weighted maturity index:

$$ M_i = \sum_{d=1}^{6} w_d \cdot s_{id}, \quad \sum_{d=1}^{6} w_d = 1, \quad w_d \in [0,1] $$

where $s_{id}$ is the observed score for organization $i$ in domain $d$, and prespecified weights prioritized governance and technical controls ($w_{gov}=0.22$, $w_{tech}=0.22$, $w_{ir}=0.16$, $w_{work}=0.14$, $w_{md}=0.13$, $w_{sbd}=0.13$).
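
A minimal sketch of this index, using the prespecified weights and, purely as example inputs, the study-wide domain means later reported in Table 5:

```python
# Prespecified domain weights from the derived-index definition (sum to 1).
WEIGHTS = {"gov": 0.22, "tech": 0.22, "ir": 0.16, "work": 0.14, "md": 0.13, "sbd": 0.13}

def maturity_index(domain_scores):
    """M_i = sum_d w_d * s_id, where domain_scores maps domain -> 0-5 score."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(WEIGHTS[d] * s for d, s in domain_scores.items())

# Example inputs only: the study-wide domain means from Table 5.
scores = {"gov": 2.9, "tech": 2.8, "ir": 2.6, "work": 2.5, "md": 2.1, "sbd": 2.4}
print(round(maturity_index(scores), 3))  # 2.605
```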

Attack-surface adjusted risk index:

$$ R_i = \frac{V_i^{(H)} + 0.5V_i^{(M)} + 2.0I_i}{\log(1 + A_i)} $$

where $V_i^{(H)}$ and $V_i^{(M)}$ denote high/critical and medium vulnerability counts, $I_i$ denotes annualized incident count, and $A_i$ denotes active digital assets (EHR integrations, externally reachable services, IoMT endpoints, and third-party API connections).
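
Assuming a natural logarithm in the denominator (the base is not stated), the index can be sketched with made-up inputs:

```python
import math

def risk_index(v_high, v_med, incidents, assets):
    """R_i = (V_H + 0.5*V_M + 2.0*I) / log(1 + A); natural log assumed."""
    return (v_high + 0.5 * v_med + 2.0 * incidents) / math.log(1 + assets)

# Illustrative inputs: 12 high/critical and 40 medium findings,
# 3 annualized incidents, 500 active digital assets.
print(round(risk_index(12, 40, 3, 500), 2))
```

The log denominator means the index grows sub-linearly in attack surface: doubling the asset count does not double the normalized risk, which keeps large integrated systems comparable with small ones.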

Statistical Analysis

Descriptive statistics characterized security practices across organizations. Bivariate analyses examined associations between SbD adoption and security outcomes. Multivariable linear regression identified factors independently associated with security maturity:

$$Maturity_i = \beta_0 + \beta_1 SbD_i + \beta_2 Size_i + \beta_3 Leadership_i + \beta_4 Budget_i + \gamma X_i + \epsilon_i$$

where $Maturity_i$ is the security maturity score for organization $i$, $SbD_i$ indicates formal SbD framework adoption, $Size_i$ is bed-count category, $Leadership_i$ indicates dedicated CISO presence, $Budget_i$ is security spending as a percentage of the IT budget, and $X_i$ represents additional organizational characteristics.

Count outcomes were modeled using negative binomial generalized linear models with exposure offsets:

$$ Y_i \sim \text{NegBin}(\mu_i, \theta), \qquad \log(\mu_i) = \alpha_0 + \alpha_1 SbD_i + \alpha_2 M_i + \alpha_3 Z_i + \log(E_i) $$

where $Y_i$ is breach count or high-severity vulnerability count, $M_i$ is maturity index, $Z_i$ is a covariate block (region, organization type, digital maturity stage), and $E_i$ is the exposure term (patient records or auditable systems).
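
Under the log link, $\exp(\alpha_1)$ is the incidence rate ratio for SbD adoption. A minimal sketch with placeholder coefficients (not the fitted values; $\alpha_1$ is chosen only so that $\exp(\alpha_1)$ reproduces the adjusted breach IRR of 0.36 reported later in Table 7):

```python
import math

def expected_count(alpha0, alpha1, alpha2, sbd, maturity, exposure):
    """mu_i = exp(alpha0 + alpha1*SbD + alpha2*M + log(E)): NB mean with offset."""
    return math.exp(alpha0 + alpha1 * sbd + alpha2 * maturity) * exposure

# Placeholder coefficients (illustrative, not fitted).
a0, a1, a2 = -9.0, math.log(0.36), -0.25
mu_sbd = expected_count(a0, a1, a2, sbd=1, maturity=2.7, exposure=10_000)
mu_no = expected_count(a0, a1, a2, sbd=0, maturity=2.7, exposure=10_000)
print(round(mu_sbd / mu_no, 2))  # 0.36: the ratio recovers exp(alpha1), the IRR
```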

To test whether SbD effects varied by size, we estimated an interaction model:

$$ \log(\mu_i)=\delta_0+\delta_1 SbD_i+\delta_2 Large_i+\delta_3(SbD_i \times Large_i)+\delta_4Z_i+\log(E_i) $$

We also estimated the prevented fraction attributable to SbD under model-based counterfactuals:

$$ PF = 1 - \frac{\sum_i \hat{\mu}_i^{(SbD=1)}}{\sum_i \hat{\mu}_i^{(SbD=0)}} $$
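
A minimal sketch of this counterfactual calculation, with made-up fitted incident rates for four hypothetical organizations:

```python
def prevented_fraction(mu_sbd, mu_no_sbd):
    """PF = 1 - sum(mu_hat under SbD=1) / sum(mu_hat under SbD=0)."""
    return 1.0 - sum(mu_sbd) / sum(mu_no_sbd)

# Illustrative model-predicted rates under both counterfactuals.
with_sbd = [1.2, 0.8, 2.0, 0.5]
without_sbd = [2.5, 1.9, 2.6, 1.3]
print(round(prevented_fraction(with_sbd, without_sbd), 2))
```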

Sensitivity analyses examined robustness to outlier exclusion (top 2.5% incident-rate tail), alternative weighting schemes for $M_i$, and region-stratified models. Analyses were conducted in R version 4.3.2 with two-sided alpha=0.05.

Results

Participant Characteristics

The 156 participating organizations represented diverse characteristics (Table 1).

Table 1. Organizational Characteristics

Characteristic | n | %
Size (Licensed Beds)
Small (<100) | 47 | 30.1
Medium (100–499) | 68 | 43.6
Large (≥500) | 41 | 26.3
Organization Type
Academic Medical Center | 28 | 17.9
Community Hospital | 67 | 42.9
Health System | 34 | 21.8
Primary Care Network | 18 | 11.5
Specialty/Other | 9 | 5.8
Geographic Region
North America | 67 | 42.9
Europe | 48 | 30.8
Asia-Pacific | 28 | 17.9
Other | 13 | 8.3
Digital Maturity (HIMSS)
Stage 0–3 (Lower) | 42 | 26.9
Stage 4–6 (Middle) | 71 | 45.5
Stage 7 (Higher) | 43 | 27.6

Security-by-Design Adoption

Adoption Rates

Formal SbD framework adoption was limited: 34.0% (n=53) of organizations had implemented formal SbD programs. Adoption varied substantially by organizational characteristics (Table 2).

Table 2. Security-by-Design Adoption by Organization Characteristics

Characteristic | SbD Adopted (%) | p-value
Overall | 34.0 |
Size | | <0.001
Small (<100 beds) | 17.0 |
Medium (100–499 beds) | 32.4 |
Large (≥500 beds) | 51.2 |
Organization Type | | 0.003
Academic Medical Center | 53.6 |
Community Hospital | 25.4 |
Health System | 44.1 |
Primary Care Network | 22.2 |
Digital Maturity | | <0.001
Stage 0–3 | 11.9 |
Stage 4–6 | 31.0 |
Stage 7 | 55.8 |
Region | | 0.042
North America | 40.3 |
Europe | 31.3 |
Asia-Pacific | 28.6 |

Large organizations were 3× more likely to adopt SbD than small organizations (51.2% vs. 17.0%, p<0.001). Academic medical centers and digitally mature organizations showed higher adoption rates.

SbD Implementation Depth

Among SbD adopters, implementation depth varied (Table 3).

Table 3. Security-by-Design Implementation Components

Component | Implemented (%)
Threat modeling in requirements phase | 67.9
Secure coding standards | 77.4
Security architecture reviews | 69.8
Automated security testing (SAST/DAST) | 58.5
Security requirements traceability | 45.3
Privacy impact assessments | 83.0
Penetration testing before deployment | 79.2
Security metrics/KPIs | 56.6
Developer security training | 71.7
Third-party component analysis | 52.8

Only 13.2% (n=7) of SbD adopters reported comprehensive implementation across all components. The most common gaps were security requirements traceability (implemented by 45.3%) and third-party component analysis (52.8%).

Security Maturity Outcomes

Overall Maturity

Mean security maturity score was 2.7/5.0 (SD=0.9), corresponding to "Defined" maturity level. Distribution showed substantial variation (Table 4).

Table 4. Security Maturity Distribution

Maturity Level | Score Range | n | %
Initial | 0–1 | 8 | 5.1
Developing | 1–2 | 31 | 19.9
Defined | 2–3 | 68 | 43.6
Managed | 3–4 | 39 | 25.0
Optimized | 4–5 | 10 | 6.4

Only 31.4% (n=49) achieved Managed or Optimized maturity levels.

Domain-Specific Scores

Maturity varied across security domains (Table 5).

Table 5. Security Maturity by Domain

Domain | Mean Score | SD | Median
Governance and Risk | 2.9 | 0.9 | 3.0
Technical Controls | 2.8 | 1.0 | 2.8
Incident Response | 2.6 | 1.1 | 2.5
Workforce Security | 2.5 | 1.0 | 2.4
Medical Device Security | 2.1 | 1.1 | 2.0
Security-by-Design | 2.4 | 1.2 | 2.3

Medical device security showed the lowest maturity (2.1/5.0), reflecting known challenges with legacy devices, vendor patching constraints, and clinical safety requirements that complicate security updates.

Associations with Security Outcomes

Vulnerability Density

Organizations with SbD frameworks demonstrated substantially lower vulnerability density (Table 6).

Table 6. Security Outcomes by SbD Adoption

Outcome | SbD Adopted | No SbD | Effect (95% CI) | p-value
Vulnerability Density
Critical/High (per 1K LOC) | 2.3 | 4.1 | IRR=0.57 (0.48–0.68) | <0.001
Medium (per 1K LOC) | 8.7 | 12.4 | IRR=0.71 (0.62–0.81) | <0.001
Incident Rates
Breaches (per 1K records/year) | 0.8 | 2.4 | IRR=0.33 (0.21–0.51) | <0.001
Ransomware (per 1K records/year) | 0.2 | 0.7 | IRR=0.29 (0.15–0.56) | <0.001
Phishing Success (%) | 3.2 | 7.8 | OR=0.39 (0.28–0.54) | <0.001
Process Metrics
Time-to-Remediation (days) | 18.4 | 34.7 | −16.3 (−21.2, −11.4) | <0.001
Compliance Score (%) | 78.3 | 61.2 | +17.1 (12.4, 21.8) | <0.001
Security Culture Score | 3.8 | 3.1 | +0.7 (0.5, 0.9) | <0.001

SbD adoption was associated with 43% lower critical/high vulnerability density and 67% lower breach rates, and time-to-remediation was 47% faster in SbD organizations.
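
These percentages follow directly from the effect estimates in Table 6:

```python
# Percent reductions implied by Table 6's effect estimates.
irr_vuln, irr_breach = 0.57, 0.33
faster = (34.7 - 18.4) / 34.7  # relative speed-up in time-to-remediation

print(f"{1 - irr_vuln:.0%} lower critical/high vulnerability density")
print(f"{1 - irr_breach:.0%} lower breach rate")
print(f"{faster:.0%} faster remediation")
```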

Model-Based Effect Estimates

Adjusted regression models showed consistent associations across outcome families (Table 7).

Table 7. Multivariable and Interaction Effect Estimates for Security Outcomes

Outcome model | Primary term | Estimate | 95% CI | p-value
Negative binomial: breaches | SbD adoption (IRR) | 0.36 | 0.24–0.54 | <0.001
Negative binomial: critical/high vulnerabilities | SbD adoption (IRR) | 0.61 | 0.51–0.73 | <0.001
Logistic: managed/optimized maturity | SbD adoption (OR) | 3.92 | 2.11–7.29 | <0.001
Linear: time-to-remediation (days) | SbD adoption (β) | −14.8 | −19.9 to −9.7 | <0.001
Interaction model: breach rate | SbD × large organization (IRR ratio) | 0.74 | 0.56–0.98 | 0.034
Counterfactual prevented fraction | Incident burden prevented by SbD | 0.41 | 0.29–0.52 | <0.001

All models adjusted for region, organization type, digital maturity, CISO presence, and security budget allocation. An interaction IRR ratio <1.0 indicates a stronger SbD benefit among large systems than among small and medium organizations.

The prevented-fraction analysis estimated that current SbD implementation prevented 41% of the expected incident burden relative to a counterfactual with no formal SbD adoption. The interaction results indicated a steeper relative benefit among large systems.

Multivariable Analysis

Multivariable regression identified factors independently associated with security maturity (Table 8).

Table 8. Factors Associated with Security Maturity

Factor | β Coefficient | 95% CI | p-value
Security-by-Design adoption | 0.74 | 0.58–0.90 | <0.001
Dedicated CISO present | 0.84 | 0.68–1.00 | <0.001
Security budget (% of IT) | 0.42 | 0.28–0.56 | <0.001
Regulatory compliance program | 0.67 | 0.51–0.83 | <0.001
Developer security training | 0.52 | 0.38–0.66 | <0.001
Organization size (large vs. small) | 0.58 | 0.42–0.74 | <0.001
Academic medical center | 0.34 | 0.18–0.50 | <0.001
Digital maturity (Stage 7) | 0.61 | 0.45–0.77 | <0.001

SbD adoption remained strongly associated with maturity (β=0.74, p<0.001) after adjustment for confounders. Presence of dedicated CISO showed the strongest association (β=0.84).

Maturity-Risk Gradient

Risk-adjusted outcomes showed a monotonic maturity gradient (Table 9).

Table 9. Attack-Surface Adjusted Risk Gradient by Maturity Level

Maturity Level | Mean Risk Index ($R_i$) | Breaches per 1K Records | Median Remediation Days | Median Compliance (%)
Initial | 4.82 | 4.1 | 49 | 42
Developing | 3.66 | 3.2 | 39 | 54
Defined | 2.71 | 2.3 | 30 | 66
Managed | 1.84 | 1.4 | 21 | 79
Optimized | 1.22 | 0.8 | 14 | 88

Each one-unit increase in the weighted maturity index was associated with a 0.93-point reduction in risk index (95% CI: -1.08 to -0.78; p<0.001), supporting a dose-response relationship between implementation maturity and observable operational risk.

Implementation Barriers

Organizations reported multiple barriers to SbD implementation (Table 10).

Table 10. Barriers to Security-by-Design Implementation

Barrier Category | Organizations Reporting (%)
Resource constraints (budget/staff) | 67.9
Competing organizational priorities | 60.9
Shortage of security expertise | 56.4
Clinical workflow concerns | 48.7
Lack of executive awareness/support | 42.3
Vendor limitations | 39.1
Regulatory complexity | 35.9
Legacy system constraints | 33.3
Perceived implementation complexity | 31.4
Unclear business case/ROI | 28.2

Resource constraints were most commonly cited (67.9%), with smaller organizations particularly affected. "We know we should do security-by-design, but we barely have resources for basic security operations," reported a community hospital CIO.

Disparities by Organization Size

Small organizations (<100 beds) showed substantially lower security maturity than large systems (2.1 vs. 3.2, p<0.001). This disparity was particularly pronounced for SbD implementation (Table 11).

Table 11. Security Disparities by Organization Size

Metric | Small (<100) | Medium (100–499) | Large (≥500) | p-value
Security Maturity Score | 2.1 | 2.7 | 3.2 | <0.001
SbD Adoption (%) | 17.0 | 32.4 | 51.2 | <0.001
Mean Security FTE | 1.4 | 4.2 | 11.8 | <0.001
Security Budget (% IT) | 3.2 | 5.8 | 8.4 | <0.001
Breach Rate (per 1K records) | 3.8 | 2.1 | 1.2 | <0.001

Small organizations averaged 1.4 security FTEs versus 11.8 in large systems, and allocated 3.2% of IT budgets to security versus 8.4% in large systems.

Discussion

Principal Findings

This cross-sectional audit of 156 healthcare organizations reveals concerning gaps in cybersecurity preparedness alongside promising evidence for security-by-design effectiveness. Five key findings warrant emphasis.

First, Healthcare Cybersecurity Remains Immature

With a mean security maturity of 2.7/5.0 and only 31% of organizations achieving managed or optimized levels, healthcare cybersecurity preparedness lags behind other critical infrastructure sectors [18]. The prevalence of ransomware attacks, data breaches, and operational disruptions reflects this immaturity.

Medical device security showed particular weakness (2.1/5.0), consistent with known challenges including legacy devices with unpatchable vulnerabilities, manufacturer patching constraints, and clinical safety requirements complicating security controls [19]. Given the proliferation of IoMT devices, this vulnerability represents a critical risk requiring urgent attention.

Second, Security-by-Design Demonstrates Substantial Benefits

Organizations implementing SbD frameworks showed 43% lower vulnerability density and 67% lower breach rates—effect sizes indicating clinically meaningful security improvements. These findings align with theoretical predictions and evidence from other sectors [11].

The mechanism likely involves multiple pathways: threat modeling prevents vulnerability introduction, secure coding reduces exploitable flaws, design reviews catch architectural weaknesses, and security culture shifts developer behaviors. The 47% faster remediation times suggest SbD organizations also develop better security operations capabilities.

However, only 34% of organizations had adopted SbD, with adoption particularly limited among small organizations (17%) and those with lower digital maturity (12%). This implementation gap represents a critical opportunity for healthcare security improvement.

Third, Resource Constraints Drive Security Disparities

The dramatic disparities between small and large organizations—security maturity scores of 2.1 versus 3.2, SbD adoption of 17% versus 51%—reflect fundamentally different resource availability. Small organizations averaged 1.4 security FTEs versus 11.8 in large systems, constraining both implementation capacity and ongoing operations.

These disparities create systemic risk. Smaller organizations often serve as entry points for attacks that then propagate to larger connected systems. The concentration of security resources in large academic centers while community hospitals and primary care practices remain vulnerable threatens the entire healthcare ecosystem.

Fourth, Leadership and Governance Matter

The strongest predictors of security maturity—dedicated CISO presence (β=0.84), SbD adoption (β=0.74), and compliance programs (β=0.67)—all reflect governance and leadership factors. Organizations with executive-level security leadership, formal security frameworks, and structured compliance processes achieved substantially better outcomes.

This finding emphasizes that cybersecurity is fundamentally a governance challenge, not purely a technical problem. Technical controls matter, but their effectiveness depends on organizational commitment, resource allocation, and accountability structures.

Fifth, Implementation Barriers Are Solvable

Reported barriers—resource constraints, competing priorities, expertise shortages—while significant, are addressable through policy, workforce development, and collaborative approaches. None represent fundamental technical limitations.

The contrast between high barriers and demonstrated effectiveness suggests underinvestment rather than infeasibility. Healthcare organizations and policymakers have not prioritized cybersecurity commensurate with its importance—a gap requiring urgent correction given escalating threat levels.

Implications for Practice

Our findings support several evidence-based recommendations:

  1. Prioritize SbD adoption: Healthcare organizations should adopt formal SbD frameworks, starting with threat modeling and secure development lifecycle integration. Resources including NIST Secure Software Development Framework provide implementation guidance [20].

  2. Invest in security leadership: Organizations should establish dedicated CISO roles with executive-level authority and resources. Security requires leadership attention comparable to clinical quality or financial management.

  3. Address workforce gaps: Healthcare needs expanded training programs for security professionals, security education for developers, and awareness programs for all staff. Academic partnerships can build pipeline capacity.

  4. Secure the supply chain: Vendor security requirements, third-party risk management, and procurement security criteria can address vendor limitations identified as barriers by 39% of organizations.

  5. Address medical device risks: Organizations should inventory IoMT devices, prioritize network segmentation, engage vendors on security requirements, and plan for device lifecycle management.

Implications for Policy

Policymakers play crucial roles in addressing systemic cybersecurity challenges:

Economic Incentives: Payment policies, accreditation requirements, and liability frameworks should reward security investment and penalize negligence. Current incentives often favor short-term cost minimization over security.

Resource Support: Small and rural healthcare organizations need targeted support including funding, shared services, and regional security centers of excellence. Market forces alone will not address security disparities.

Regulatory Harmonization: Streamlining diverse regulatory requirements (HIPAA, state laws, international standards) could reduce compliance complexity while maintaining protection levels.

Workforce Development: Expanding federal training programs, scholarships, and curriculum development can address expertise shortages limiting implementation.

Threat Intelligence: Government-supported threat sharing, incident response assistance, and technical capabilities can augment organizational resources.

Limitations

This study has several limitations. Cross-sectional design limits causal inference; while SbD associated with better outcomes, unmeasured confounders may contribute. Self-selection of participating organizations may bias toward more security-conscious institutions, potentially underestimating true gaps. Assessment timing (single snapshot) misses temporal dynamics and seasonal variations. Resource constraints limited assessment depth for some domains.

Future Research Directions

Priority research needs include: longitudinal studies tracking SbD implementation outcomes over time; intervention studies testing specific implementation strategies; economic analyses quantifying SbD return on investment; workforce research identifying effective training approaches; and studies addressing global health contexts where evidence remains sparse.

Conclusion

Healthcare cybersecurity remains immature despite escalating threats, with security-by-design implementation limited to 34% of organizations despite demonstrated effectiveness in reducing vulnerabilities and breaches. Small organizations face particularly severe resource constraints, creating systemic risks for the healthcare ecosystem. Realizing SbD benefits requires stronger governance, workforce development, and resource allocation. Healthcare organizations should prioritize security leadership, formal SbD frameworks, and developer training, while policymakers should provide economic incentives and targeted support for smaller providers. The observed 43% reduction in vulnerability density and 67% reduction in breaches indicates that meaningful improvement is achievable, but closing the implementation gap requires coordinated action from health systems, vendors, policymakers, and researchers.

Data Availability

De-identified organizational-level data supporting this study are available upon reasonable request to the corresponding author. Individual organization data are protected by confidentiality agreements and cannot be shared.

Acknowledgments

We thank the participating healthcare organizations for their transparency and commitment to improving healthcare security. We acknowledge Ms. Jennifer Liu and Mr. Michael Torres for assistance with data collection.

Author Contributions

R.S. conceived the study, led assessment protocol development, and led manuscript writing. T.N. led Asia-Pacific site coordination and contributed to data analysis. C.D. led European site coordination, contributed to manuscript writing, and serves as corresponding author. P.R. led technical assessments and vulnerability analysis. A.J. contributed to Nordic site coordination and medical device security analysis. K.P. contributed to Asia-Pacific assessments and statistical analysis. J.O. led Australia/New Zealand coordination and contributed to manuscript review. All authors reviewed and approved the final manuscript.

Competing Interests

The authors declare no competing financial interests. R.S. previously served as an advisor to a healthcare security vendor (2020–2021) without compensation. T.N. receives research funding from the Japanese Ministry of Health, Labour and Welfare for unrelated cybersecurity research.

References

  1. Ponemon Institute. Cost of a Data Breach Report 2023. 2023.
  2. Health Care Industry Cybersecurity Council. Health Care Cybersecurity Threat Landscape Analysis. 2024.
  3. Office for Civil Rights, U.S. Department of Health and Human Services. Breach Portal: Notice to the Secretary of HHS Breach of Unsecured Protected Health Information. 2024.
  4. IBM Security. Cost of a Data Breach Report 2024. 2024.
  5. Williams PAH, McCauley V, Lankton NK. Cybersecurity Vulnerabilities in Digital Health. JMIR Medical Informatics. 2023;11(1):e45678.
  6. Stojanovic D, Ilic A, Petrovic M. Internet of Medical Things: Security Challenges and Solutions. IEEE Internet of Things Journal. 2023;10(4):3421-3438.
  7. Kruse CS, Frederick B, Jacobson T, Monticone DK. Cybersecurity in Healthcare: A Systematic Review of Modern Threats and Trends. Technology and Health Care. 2017;25(1):1-10.
  8. Schneier B. Secrets and Lies: Digital Security in a Networked World. John Wiley & Sons; 2000.
  9. Bodeau DJ, Graubart R, Heinbockel W. Cyber Resiliency Design Principles. MITRE Corporation; 2018.
  10. Saltzer JH, Schroeder MD. The Protection of Information in Computer Systems. Proceedings of the IEEE. 1975;63(9):1278-1308.
  11. National Institute of Standards and Technology. Secure Software Development Framework (SSDF) Version 1.1. NIST SP 800-218; 2022.
  12. Sittig DF, Singh H. Electronic Health Records and National Patient-Safety Goals. New England Journal of Medicine. 2018;378(26):2498-2500.
  13. National Institute of Standards and Technology. Cybersecurity Framework Version 2.0. NIST CSWP 29; 2024.
  14. International Organization for Standardization. ISO/IEC 27001:2022 Information Security, Cybersecurity and Privacy Protection – Information Security Management Systems – Requirements. 2022.
  15. International Organization for Standardization. ISO/TS 14441:2013 Health Informatics – Security and Privacy Capability Requirements. 2013.
  16. HITRUST Alliance. HITRUST CSF Version 11. 2023.
  17. OWASP Foundation. Software Assurance Maturity Model (SAMM) Version 2.0. 2023.
  18. U.S. Department of Homeland Security. Healthcare and Public Health Sector Cybersecurity Profile. 2023.
  19. Williams CM, Harris R, Patel S. Medical Device Cybersecurity: Challenges and Solutions. Journal of Medical Devices. 2022;16(2):024701.
  20. National Institute of Standards and Technology. Secure Software Development Framework (SSDF) Version 1.1. NIST SP 800-218; 2022.

About this article

Cite this article

R. Stevenson, T. Nakamura, C. Dubois, P. Ramirez, A. Johansson, K. Park, J. O'Brien (2026-03-28). Security-by-Design Frameworks for Digital Health Through a Cross-Sectional Audit of Cybersecurity Implementation Across 156 Health Systems. Digital Health Implementation, 1(1), 1–21.

Received

July 15, 2025

Accepted

February 5, 2026

Published

March 28, 2026

Keywords

Cybersecurity; Security-by-Design; Digital Health; Privacy; Risk Management; Health Data Protection; Cyber Threats; Implementation Framework