A prospect sends over their security questionnaire. Buried on page 4: “Do you have an acceptable use policy? Please provide a copy.” You know you have something. You just have to find it.
An acceptable use policy (AUP) is a formal document that defines how employees, contractors, and authorised users may use company IT systems, networks, devices, and data. It sets out what is permitted, what is prohibited, and what happens when the rules are broken. Every company with employees and IT infrastructure needs one — not because it is bureaucracy, but because without it, there are no documented rules for people to follow.
AUP ownership typically sits with the Head of Security, IT Lead, or CTO. The policy applies to everyone with access to company systems: employees, contractors, interns, and any third party connecting to your network or handling your data.
The frameworks that expect an AUP include:
- SOC 2 (CC6.1, CC6.3): logical access controls and permitted use of systems
- ISO 27001 (A.5.10, A.8.1 — 2022 edition): acceptable use rules for information assets and user endpoint devices
- HIPAA (§164.308(a)(3), §164.310(b)): workstation use policy and workforce authorisation
- GDPR (Articles 5 and 32): accountability and appropriate processing of personal data
- CMMC (AC.1.001, AC.1.002): limiting access to authorised users and authorised transaction types
By the end of this guide, you will know exactly what to include in an acceptable use policy, have a free template you can use today, and understand what auditors actually look for when they review it.
Here is what I will cover:
- What an acceptable use policy is and how it differs from terms of service
- What every AUP must include, including AI tools coverage
- A free, complete AUP template you can customise and distribute
- How to roll it out and collect acknowledgements
- What SOC 2, ISO 27001, HIPAA, and GDPR require from your AUP
- The evidence auditors will ask for
- The mistakes that get companies flagged in audits
Acceptable Use Policy (AUP), Explained
Terms of service govern how your customers may use your product. An acceptable use policy governs how your employees and contractors may use your systems. They are different documents, aimed at different audiences, with different legal implications.
The AUP is an internal document. It applies the moment someone logs into a company laptop, opens a work email, or connects to the corporate network. It is not aspirational guidance — it is a set of enforceable rules.
Acceptable use policy definition: what it covers and what it doesn’t
In scope for a standard AUP:
- All company-owned and managed systems: laptops, mobile devices, servers, cloud services, SaaS applications, email accounts, and network infrastructure
- Personal devices used to access company systems (BYOD)
- Remote and hybrid working environments
Not in scope: how your customers or the public interact with your product. That is covered by your terms of service and privacy policy.
Who owns an acceptable use policy
In most organisations, the Head of Security or IT Lead owns the policy and is responsible for keeping it current, distributing it to all covered personnel, and collecting acknowledgements. In smaller companies, this role often falls to the CTO or the founder running the compliance effort.
Ownership means accountability: the policy owner is the person an auditor will ask when the policy was last reviewed, who has signed it, and what changed in the last version.
Acceptable use policy in cyber security
In a security context, the AUP is a foundational human-layer control. Technical controls — firewalls, MFA, endpoint detection — govern what systems will and will not allow. The AUP governs what people will and will not do.
Without it, there are no formal rules for employees to violate. That creates real gaps: SOC 2’s CC6.1 requires documented access controls that cover permitted use; ISO 27001’s A.5.10 requires acceptable use rules for information assets. If those rules do not exist in writing, the control cannot be evidenced.
The Real Risks of Skipping an Acceptable Use Policy
A startup I talked to failed its SOC 2 readiness assessment on CC6.1. Not because they did not have a policy — they did. Because they had emailed it to the team two years ago and never collected a single acknowledgement. Auditors treated it as if it did not exist.
That is the compliance risk in concrete form. But there are two others.
Security risk. Without a clear AUP, employees make their own judgements. They store files in personal Dropbox accounts because it is convenient. They share a login with a colleague to save time. They paste a customer’s data into a public AI model because no one ever said they could not. None of these are malicious. All of them create real exposure.
Legal and HR risk. If an employee misuses company systems — leaks data, harasses a colleague over company email, runs a side business on company hardware — and there is no AUP, the company has limited recourse. No documented rules, no enforcement mechanism. The policy is what turns “that was wrong” into “that was a clear violation of a policy you signed.”
The importance of an acceptable use policy sits at the intersection of all three. It is a security control, a compliance document, and a legal instrument at the same time.
Which Teams and Companies Need an Acceptable Use Policy?
If you have employees and IT systems, you need one. There is no size threshold where this stops applying.
The situations that make it urgent:
- Pursuing SOC 2 or ISO 27001: auditors will ask for it, and they will ask to see signed acknowledgements
- Handling PHI: HIPAA requires a workstation use policy for every workforce member with access to protected health information
- BYOD or hybrid workforces: personal devices accessing company systems need to be governed explicitly
- Selling to enterprise customers: procurement teams and security questionnaires will ask for your AUP
Acceptable use policy for employees
Every employee, contractor, intern, and third-party vendor with system access should be covered. The policy must name them explicitly — “everyone knows” is not a documented control.
What auditors care about most is not the length of the policy. It is whether every covered person has acknowledged it. A one-page AUP that every employee has signed is worth more to an auditor than a 20-page one that nobody has seen.
Acceptable use policy for workplace technology
The AUP must address the actual technology environment your team uses. Many policies written before 2020 are silent on cloud storage, SaaS tools, remote access, and AI. That is a problem: the gap between what the policy covers and how people actually work is exactly where auditors look.
If employees use personal devices to access company systems, the policy must say so and reference the BYOD Policy. If you use cloud-based tools, the policy must address approved vs. unapproved services. If your team uses AI tools, that needs its own section — and if AI use is significant, a standalone AI governance policy is worth adding to the stack.
What to Include in an Acceptable Use Policy
A common mistake is writing an AUP that says “use company systems appropriately” and calling it done. That is not a policy — it is an aspiration. The prohibited activities section must be specific enough that an employee could look at their current behaviour and know whether they are in compliance.
Core acceptable use policy components
| Component | What to cover |
|---|---|
| Purpose & Scope | Who and what systems this covers: employees, contractors, third parties; company-managed and BYOD devices |
| Authorised Use | Permitted activities: business use as primary purpose; limited personal use if explicitly allowed |
| Prohibited Activities | Named, specific banned behaviours: credential sharing, data exfiltration, unlicensed software, shadow IT, illegal content, pasting confidential data into unapproved AI tools |
| Device and Network Use | Company vs personal devices, VPN requirements, public Wi-Fi restrictions |
| Internet and Email Use | Acceptable browsing, personal email on work devices, phishing awareness obligations |
| Cloud and SaaS Use | Approved tools list; prohibition on storing company data in personal cloud accounts; shadow IT ban |
| AI Tools | Approved AI tools; data restrictions for public models; output review requirements |
| Data Handling | Classification-based rules; reference to data classification policy |
| Social Media | Rules on disclosing company information from personal accounts |
| Monitoring Notice | Statement that company systems may be monitored: legally required in many jurisdictions |
| Enforcement | Consequences: disciplinary action up to termination and legal referral |
| Exceptions Process | How to request an exception, who approves, written approval required |
| Review Cadence | Annual minimum; review after major incidents or tech changes |
| Acknowledgement | Signed or digital acknowledgement required from all covered personnel |
Prohibited activities: what must be explicit
The prohibited activities section is where vague AUPs break down in practice and in audits.
“Do not misuse company systems” is not enforceable. An employee who stored customer data in their personal Google Drive can argue — credibly — that they did not think of that as “misuse.” The policy must name the behaviour: storing confidential data in unapproved locations is prohibited. Entering confidential company data, customer data, source code, or PII into unapproved AI tools is prohibited. Sharing login credentials with any other person is prohibited.
Specificity is what makes the policy legally enforceable and what makes it pass an audit.
Monitoring and privacy notice
Many teams skip the monitoring notice or bury it in legal language. Do not.
In the UK and EU, employees must be explicitly informed that company systems may be monitored. Several US states have similar requirements. Without this notice, your monitoring activities may themselves be unlawful — and the notice is what removes any reasonable expectation of privacy on company-owned systems. A single clear sentence is enough: “By using company systems, you acknowledge that activity may be monitored for security and compliance purposes.”
Free Acceptable Use Policy Template (+ PDF Download)
Acceptable use policy example: how to use this template
This is a complete, pre-populated AUP you can customise and distribute. It covers AI tools, BYOD, monitoring, and the full set of components outlined above. Every section has best-practice content already filled in — replace the bracketed fields with your company’s specifics.
This acceptable use policy sample is designed to work for both a 5-person startup and a 500-person company. The AI tools and BYOD sections are included because they are now expected by auditors reviewing SOC 2, ISO 27001, and HIPAA compliance.
PDF and Word versions are available for download.
ACCEPTABLE USE POLICY
Policy Owner: [Name / Role — e.g. Head of Security, CTO]
Approved By: [Name / Role — e.g. CEO]
Effective Date: [Date]
Next Review Date: [Date — recommended: 12 months from effective date]
Version: 1.0
1. Purpose
[Company Name] provides information systems, networks, devices, and cloud services to support business operations. This Acceptable Use Policy (AUP) defines the rules that all authorised users must follow when accessing these resources. The policy exists to protect the confidentiality, integrity, and availability of company systems and data, to meet compliance obligations, and to provide a clear basis for enforcement when rules are violated.
2. Scope
| Resource Type | In Scope | Notes |
|---|---|---|
| Company-issued laptops and desktops | Yes | All devices provisioned by IT |
| Company-issued mobile devices | Yes | Including tablets |
| Personal devices (BYOD) | Yes | When used to access company systems or data |
| Corporate network and VPN | Yes | Including remote access |
| Cloud services and SaaS applications | Yes | Approved tools only |
| Company email accounts | Yes | |
| Servers and cloud infrastructure | Yes | |
This policy applies to all employees, contractors, consultants, interns, and any other persons authorised to access company systems or data.
3. Roles and Responsibilities
| Role | Responsibility |
|---|---|
| Policy Owner ([Name/Role]) | Maintain, update, and distribute this policy; collect acknowledgements; manage exceptions |
| All Covered Personnel | Read, acknowledge, and comply with this policy; report violations |
| IT / Security Team | Enforce technical controls; monitor for violations; process exception requests |
| Managers and Team Leads | Ensure direct reports have acknowledged the policy; report violations to IT/Security |
| HR | Include policy acknowledgement in onboarding; support enforcement actions |
4. Authorised Use
Company IT resources are provided primarily for business purposes. Limited personal use is permitted provided it:
- Does not interfere with business operations or consume excessive resources
- Does not violate any provision of this policy or applicable law
- Does not create a security, legal, or reputational risk for [Company Name]
5. Prohibited Activities
The following activities are expressly prohibited for all covered personnel:
- Sharing login credentials with any other person, including colleagues
- Accessing, downloading, or distributing illegal, offensive, discriminatory, or harassing content
- Installing or using unlicensed, unauthorised, or unapproved software on any company or BYOD device
- Attempting to access systems, accounts, data, or network resources without authorisation
- Circumventing security controls, content filters, endpoint agents, or monitoring systems
- Transmitting or storing confidential company data, customer data, or source code in personal accounts or unapproved services
- Using company systems to conduct personal business activities for financial gain
- Using company email or accounts to send spam, phishing messages, or unsolicited commercial communications
- Entering confidential company data, customer data, PII, or source code into unapproved AI tools or public language models
- Accessing or storing illegal material of any kind on company systems
6. Device and Network Use
Company-issued devices are provided for business use. Personal use must be minimal and must not create security or compliance risks.
- When working remotely or outside a trusted network, employees must use the [Company Name] VPN to access internal systems or sensitive data
- Public or unsecured Wi-Fi networks must not be used to access sensitive data without active VPN protection
- Personal devices (BYOD) used to access company systems must comply with the [Company Name] BYOD Policy, including minimum OS version, screen lock, and remote-wipe capability requirements
- Employees must not connect unapproved personal storage devices (USB drives, external hard drives) to company-issued systems
7. Internet and Email Use
- Internet access is provided primarily for business use. Incidental personal use during working hours is permitted.
- Employees must not access websites that are illegal, offensive, or blocked by [Company Name] content filtering systems
- Email sent from [Company Name] accounts must comply with this policy and applicable law. Employees must not use company email addresses for personal mailing lists, subscriptions, or communications unrelated to work
- Employees must follow security awareness training guidance on identifying and reporting phishing attempts and suspicious links. Suspected phishing emails must be reported to [IT Security contact or alias]
8. Cloud Services and SaaS Applications
- Employees must only use cloud services and SaaS tools that have been approved by IT or Security. The current approved tools list is maintained at [location — e.g. internal wiki, IT ticketing system]
- Company data — including customer records, source code, financial data, and internal documents — must not be stored in personal cloud accounts (e.g. personal Google Drive, Dropbox, iCloud)
- Shadow IT — using unapproved applications to process, store, or transmit company data — is prohibited. If you need a tool that is not on the approved list, submit a request to IT/Security for evaluation
9. AI Tools
- Employees must only use AI tools (including large language models such as ChatGPT, Microsoft Copilot, Google Gemini, or xAI Grok) that have been approved by IT or Security
- The following types of data must not be entered into public or unapproved AI tools under any circumstances: confidential company data, customer data, personally identifiable information (PII), source code, financial information, or any data classified as Confidential or Restricted under the [Company Name] Data Classification Policy
- AI-generated content used in customer-facing communications, product documentation, or public-facing materials must be reviewed and approved by a human before publication or distribution
- Employees must not use AI tools to impersonate another person, generate deceptive or misleading content, or attempt to circumvent security controls or policy requirements
10. Data Handling
All data must be handled in accordance with the [Company Name] Data Classification Policy. Employees handling Confidential or Restricted data must:
- Store it only in approved locations
- Transmit it only via approved, encrypted channels
- Share it only with individuals who have a documented business need
Contact [Policy Owner / IT Security] if you are unsure how to handle a specific type of data.
11. Social Media
Employees must not disclose the following on personal or professional social media accounts, regardless of privacy settings:
- Confidential company information or internal business strategy
- Customer names, data, or details of customer engagements
- Unreleased product features, roadmap details, or technical architecture
- Financial information, funding status, or pending transactions
- Personal information about colleagues without their explicit consent
12. Monitoring and Privacy
[Company Name] reserves the right to monitor activity on its information systems, networks, and devices for security, compliance, and operational purposes. This includes network traffic, email, file transfers, and application usage on company-owned systems.
Users of company systems and of personal devices connecting to company resources should have no expectation of privacy in their use of those systems. Monitoring may be conducted without prior notice.
13. Enforcement
Violations of this policy will be assessed by IT/Security in coordination with HR and Legal. Consequences may include:
- Formal written warning
- Revocation of system access privileges
- Disciplinary action up to and including termination of employment or contract
- Referral to law enforcement where violations involve illegal activity
The severity of the consequence will be proportionate to the nature and impact of the violation.
14. Exceptions
Requests for exceptions to any provision of this policy must be:
- Submitted in writing to [Policy Owner / IT Security]
- Supported by a documented business justification
- Approved in writing by [Policy Owner] before the exception is used
Approved exceptions must specify: the scope of the exception, the duration, the alternative control in place, and the risk owner. Verbal exceptions are not valid. Exceptions will be reviewed at each annual policy review.
15. Review and Updates
This policy will be reviewed at least annually. Out-of-cycle reviews will be triggered by:
- A significant security incident related to user behaviour
- Material changes to the technology environment (new platforms, AI tools, or cloud migrations)
- Significant workforce changes (rapid headcount growth, new remote working model)
- Changes to applicable laws, regulations, or compliance framework requirements
16. Version History
| Version | Date | Author | Changes |
|---|---|---|---|
| 1.0 | [Date] | [Name] | Initial version |
17. Acknowledgement
By using [Company Name]’s systems and resources, I confirm that I have read, understood, and agree to comply with this Acceptable Use Policy. I understand that violations may result in disciplinary action.
Signature: _________________________ Date: ____________
Name (print): ______________________ Role: ____________
How to Write and Roll Out Your Acceptable Use Policy
Having a policy is step one. Getting it signed by everyone and keeping the evidence is what actually makes it work in an audit.
1. Assign a policy owner. The Head of Security, IT Lead, or CTO owns it and is accountable for keeping it current.
2. Inventory your scope. List every system, cloud tool, and device category employees access. This becomes the scope section.
3. Draft from the template. Customise the prohibited activities section for your specific tech stack. Do not leave the AI tools or BYOD sections blank.
4. Legal review. The monitoring notice requirements vary by jurisdiction. If you operate in the UK, EU, or certain US states, have a lawyer review that section before distributing.
5. Leadership approval. Get documented sign-off from the CEO and Head of Security. Version the document from day one.
6. Distribute and collect acknowledgements. Send to every employee, contractor, and third party with system access. Require a signed or digital acknowledgement from each. This is what auditors will ask for.
7. Map to compliance controls. Link each clause to the relevant SOC 2, ISO 27001, HIPAA, or GDPR control in your compliance tool.
8. Collect your evidence. Signed acknowledgements, distribution log, policy version history. Keep these somewhere you can retrieve them quickly during an audit.
9. Set a review reminder. Annual at minimum. Update the moment you adopt new AI tools, a new cloud platform, or a significant remote working arrangement.
10. Update after incidents. If a policy violation occurs, review whether the policy was specific enough. Vague wording that allowed a violation is a policy problem, not just an HR problem.
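The distribute-and-collect-acknowledgements step is the one auditors probe hardest, and it is easy to check programmatically. As a minimal sketch — the function name, field names, and date format here are assumptions, not a prescribed schema — compare your personnel roster against the acknowledgement log to find who has no current signature on file:

```python
from datetime import datetime, timedelta

def missing_acknowledgements(roster, acks, max_age_days=365, now=None):
    """Return emails of people on the roster with no acknowledgement on
    file, or one older than max_age_days (stale acknowledgements are a
    gap too, since auditors expect them to be current)."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=max_age_days)
    # Latest-known signature date per person, keyed by email
    signed = {row["email"]: datetime.strptime(row["signed_at"], "%Y-%m-%d")
              for row in acks}
    return [p["email"] for p in roster
            if signed.get(p["email"]) is None or signed[p["email"]] < cutoff]
```

Run this against your HR roster export before every audit window: an empty result is the state you want to be in.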
Using an acceptable use policy generator
An acceptable use policy generator — whether that is a template, a compliance platform, or an AI tool — is a valid starting point. The template above is a complete generator output you can work from directly.
The risk is treating the output as final. Most generator outputs predate AI tools and BYOD as dominant concerns. Before distributing anything generated, check: does it explicitly cover AI tools? BYOD? Remote access? The monitoring notice? If the answer to any of these is no, the policy is not ready.
AI Acceptable Use Policy: What to Add Right Now
Here is a number worth knowing: in a 2024 survey, more than 75% of knowledge workers reported using AI tools at work. The majority of companies with AUPs written before 2023 have zero coverage of this. That is a material gap — not a technicality.
When an employee pastes a customer’s PII into ChatGPT, the question is not “was that wrong?” It is “did you tell them it was prohibited?” Without an explicit AI section in your AUP, the answer is no.
The AI section in the template above is a minimum. If your team uses AI tools heavily or in sensitive workflows, consider a standalone AI governance policy that addresses procurement, model risk assessment, and output review in more detail.
What to include in an AI acceptable use policy
- Approved vs. unapproved tools: maintain a current approved tools list. Require IT approval before using any new AI tool for company work.
- Data restrictions: confidential data, customer data, source code, and PII must not be entered into public AI models. This should be stated explicitly, not implied.
- Output review requirements: AI-generated content in customer-facing materials must be reviewed by a human before use. The policy should say this plainly.
- IP and attribution: who owns content co-created with AI tools, and what disclosure obligations apply.
- Prohibited uses: using AI to impersonate another person, generate deceptive content, or attempt to circumvent security controls.
AI acceptable use policy examples
Concrete examples help employees understand the line:
| Scenario | Permitted? |
|---|---|
| Using Copilot for code suggestions on an internal project (no customer data in scope) | Yes, if Copilot is on the approved tools list |
| Pasting a customer’s database schema or PII into a public ChatGPT session | No — expressly prohibited |
| Using an enterprise AI subscription with data protection enabled, approved by IT | Yes |
| Using a personal AI subscription for company work without IT sign-off | No |
| Asking an AI tool to draft internal documentation, then reviewing it before publishing | Yes |
| Asking an AI tool to draft a customer-facing email and sending it without review | No |
Acceptable Use Policy Requirements Across SOC 2, ISO 27001, HIPAA, and GDPR
The AUP sits at the intersection of most major compliance frameworks. Here is what each actually requires:
| Framework | Relevant control | What is expected |
|---|---|---|
| SOC 2 | CC6.1, CC6.3 | Logical access controls; the AUP defines permitted use of systems and the rules that govern access for all users |
| ISO 27001 | A.5.10, A.8.1 (2022 edition) | Rules for acceptable use of information and associated assets must be documented, communicated, and implemented |
| HIPAA Security Rule | §164.308(a)(3), §164.310(b) | Workforce authorisation procedures; a workstation use policy is required for every workforce member with PHI access |
| GDPR | Article 5, Article 32 | Data minimisation and appropriate processing; the AUP supports the accountability principle and helps demonstrate appropriate technical and organisational measures |
| CMMC | AC.1.001, AC.1.002 | Limit system access to authorised users and to the specific types of transactions those users are authorised to perform |
A few things worth noting. ISO 27001’s 2022 edition updated the control numbering (A.5.10, acceptable use of information and associated assets, replaces A.8.1.3 from the 2013 edition). SOC 2 does not prescribe the exact content of an AUP, but it expects the policy to be documented, distributed, and acknowledged — auditors check all three. HIPAA explicitly requires a workstation use policy for every workforce member who may encounter PHI, which is broader than most teams assume.
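Linking each clause to its controls, as a compliance tool does, amounts to a small piece of structured data you can keep yourself. A minimal sketch — the clause names and control IDs here are illustrative, drawn from the table above, not a fixed schema — that answers the mapping in both directions:

```python
# Illustrative clause-to-control map; clause names follow this
# article's template sections, control IDs mirror the table above.
CLAUSE_CONTROLS = {
    "Prohibited Activities":  ["SOC2 CC6.1", "ISO27001 A.5.10", "CMMC AC.1.002"],
    "Device and Network Use": ["ISO27001 A.8.1", "HIPAA 164.310(b)"],
    "Monitoring and Privacy": ["SOC2 CC6.1", "GDPR Art.32"],
    "Acknowledgement":        ["SOC2 CC6.1", "HIPAA 164.308(a)(3)"],
}

def clauses_for_control(control):
    """Invert the map: which AUP clauses evidence a given control?
    Useful when an auditor asks where CC6.1 is covered."""
    return sorted(clause for clause, controls in CLAUSE_CONTROLS.items()
                  if control in controls)
```

Even kept as a spreadsheet rather than code, this two-way mapping is what lets you answer “show me where your policy covers this control” without searching the document live in front of an auditor.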
What auditors actually check
When an auditor reviews your AUP, they are not just reading the document. They are checking:
- The policy itself: current version, policy owner named, effective date, review date
- Acknowledgements: signed records from every current employee and contractor — not just some, all
- Distribution evidence: proof it was sent to personnel, not just uploaded to a shared drive
- Version history: when it was last reviewed, what changed, and who approved the change
- Onboarding integration: whether the policy is part of the new hire onboarding process
- For HIPAA: documentation that all workforce members with PHI access have been trained on workstation use requirements
What Auditors Look for When Reviewing Your AUP
The policy document is the starting point. The evidence is what matters.
| Record type | What it should show |
|---|---|
| Signed acknowledgements | Digital or wet signature from every covered employee and contractor, with date; must be current, not historical |
| Distribution log | Record that the policy was actively sent to all covered personnel — not just made available |
| Policy version history | Version number, change log, and documented approval per version |
| Training records | If AUP content is covered in security awareness training, completion logs tied to specific training sessions |
| Review records | Evidence that annual review occurred: approval sign-off, dated version bump, or documented review meeting |
| Incident records | Any policy violations documented and resolved, with outcome noted |
| Exception approvals | Written, approved exceptions with approver identity, justification, alternative control, and expiry date |
The signed acknowledgements are the single most commonly missing piece. Teams send the policy by email and assume that is sufficient. It is not. An email sent is not an acknowledgement received. You need a record that shows each person received, read, and agreed to the policy — ideally with a timestamp.
Acceptable Use Policy Mistakes That Create Audit and Legal Risk
Mistake 1: Vague prohibited activities. Writing “do not misuse company systems” instead of naming specific behaviours. When enforcement is needed — or when an auditor reads it — vague statements do not hold up. Name the behaviours: credential sharing, shadow IT, pasting PII into AI tools.
Mistake 2: No monitoring notice. Leaving out the privacy clause. In the UK, EU, and several US states, employees must be explicitly informed that their activity on company systems may be monitored. Without this, your monitoring may be the thing that is illegal.
Mistake 3: No AI tools coverage. Most AUPs written before 2023 say nothing about generative AI or large language models. Employees are using them. If the policy does not address them, there are no rules to enforce.
Mistake 4: Acknowledgements never collected. The policy was emailed to the team once. No one signed anything. When the auditor asks to see acknowledgements, there is nothing to show. This single gap fails the relevant control in SOC 2 and ISO 27001.
Mistake 5: Treating it as a one-time document. A policy drafted in 2019 does not cover remote working at scale, cloud proliferation, BYOD, or AI. Version history with a 4-year-old last review date is a red flag in any audit.
Mistake 6: No exception process. Without a formal exception mechanism, employees work around the policy informally. Informal exceptions have no paper trail. When an auditor asks about exceptions, you have nothing to show.
Mistake 7: BYOD not addressed. If employees use personal devices and the AUP says nothing about them, you have an ungoverned access surface. That gap will be flagged in a SOC 2 readiness review and will be found in any serious security incident investigation.
Scaling Your Acceptable Use Policy: Startup to Enterprise
Whether you need an internal employee policy or a website acceptable use policy for your platform’s users, the governance requirements are the same: specific rules, documented acknowledgement, regular review. The depth varies by company stage.
Early-stage startups (1–25 employees)
Start with the template above. Keep it at one page if you can. The sections you absolutely need: prohibited activities (specific), device and network rules, AI tools, data handling basics, monitoring notice, and acknowledgement.
A 1-page signed AUP is worth more to an auditor than a 20-page unsigned one. Do not let perfection delay distribution. Get it out, get it signed during onboarding for every hire and contractor, and review it annually.
Growing companies (25–200 employees)
Add dedicated sections for cloud and SaaS governance, BYOD, remote access, and AI tools. Reference your BYOD Policy and Remote Access Policy directly rather than duplicating their content in the AUP.
At this stage, tracking acknowledgements in a spreadsheet stops working. Move to a system — whether that is your HR tool, your compliance platform, or dedicated policy management software — that records acknowledgements with timestamps automatically.
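What “records acknowledgements with timestamps automatically” means in practice is an append-only record per person, per policy, per version. A minimal sketch of that data model — the class and field names are assumptions for illustration, not any particular tool’s schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Acknowledgement:
    """One immutable record: who signed which policy version, and
    exactly when (UTC timestamp, the detail auditors ask for)."""
    email: str
    policy: str
    version: str
    signed_at: datetime

class AckLog:
    """Append-only log; records are never edited, only added, so the
    history itself is audit evidence."""
    def __init__(self):
        self._records = []

    def record(self, email, policy, version, signed_at=None):
        ack = Acknowledgement(email, policy, version,
                              signed_at or datetime.now(timezone.utc))
        self._records.append(ack)
        return ack

    def latest(self, email, policy):
        """Most recent acknowledgement for a person and policy, if any."""
        matches = [r for r in self._records
                   if r.email == email and r.policy == policy]
        return max(matches, key=lambda r: r.signed_at) if matches else None
```

The design choice that matters is append-only: re-signing after a policy update creates a new record rather than overwriting the old one, so you can show an auditor both that version 1.0 was signed in its day and that version 2.0 is signed now.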
Larger enterprises (200+ employees)
The AUP becomes the umbrella document that references a suite of supporting policies: BYOD Policy, Remote Access Policy, AI Governance Policy, Data Classification Policy. The AUP itself states the principles and rules; the supporting policies provide the detail.
Role-based variations may be needed for departments with elevated access: engineering teams with production infrastructure access, finance teams with financial data, legal with privileged communications. The AUP should note that department-specific addenda exist and where to find them.
How ComplyJet Keeps Your Acceptable Use Policy Audit-Ready
Writing a good acceptable use policy is the easy part. The hard part is keeping it current, distributed, signed by everyone, and evidenced — year after year, across a growing team.
ComplyJet includes a pre-built acceptable use policy template mapped to SOC 2, ISO 27001, HIPAA, and GDPR controls. You customise it in the platform and publish it directly to your team. When you send the policy for acknowledgement, every response is logged with a timestamp. No spreadsheets, no chasing people on Slack.
Version history is tracked automatically. When the annual review comes around, the system reminds the policy owner and logs the review once it is complete. Each clause is linked to the relevant compliance control, so auditors can trace policy to evidence without any manual work on your part.
When an auditor asks for your AUP evidence, you export the acknowledgement log, review history, and current policy version in one step.
FAQs
What is an acceptable use policy?
An acceptable use policy (AUP) is a document that defines how employees and authorised users may use company IT systems, networks, devices, and data. It sets out what is permitted, what is prohibited, who owns the policy, and what happens when the rules are broken.
What is the purpose of an acceptable use policy?
The AUP serves three purposes at once: it protects the company from security breaches caused by employee behaviour; it satisfies compliance requirements under SOC 2, ISO 27001, HIPAA, and GDPR; and it gives the company a documented, enforceable basis for taking disciplinary action when rules are violated.
What should be included in an acceptable use policy?
At minimum: purpose, scope, authorised use, prohibited activities (specific, not vague), device and network rules, cloud and SaaS governance, AI tools coverage, data handling obligations, a monitoring and privacy notice, enforcement consequences, an exceptions process, and an acknowledgement section. If your team uses personal devices, BYOD coverage is also required.
Why is an acceptable use policy important?
Without one, employees have no formal rules to follow and the company has no enforcement mechanism when something goes wrong. It is also a required or expected document under SOC 2 (CC6.1), ISO 27001 (A.8.1), HIPAA (§164.310(b)), GDPR (Article 32), and CMMC (AC.1.001).
What is an acceptable use policy in cyber security?
In a security context, the AUP is a human-layer control. Technical controls govern what systems allow. The AUP governs what people are required to do. It is how you operationalise access rules, data handling standards, and security awareness obligations at the individual level — and it is the evidence that those rules existed when auditors ask.
What should our AI acceptable use policy include?
Which AI tools are approved for use; what data types cannot be entered into public AI models (PII, confidential data, source code, customer data); output review requirements for any AI-generated content used externally; IP and attribution obligations; and explicit prohibitions on using AI to impersonate, generate deceptive content, or bypass security controls.
How often should an AUP be reviewed?
At minimum, annually. Review it immediately after significant security incidents, major workforce changes, the adoption of new AI tools or cloud platforms, or changes to applicable laws or compliance framework requirements.
Do employees need to sign the acceptable use policy?
Yes. SOC 2, ISO 27001, and HIPAA auditors all expect signed or digitally acknowledged AUPs from every covered employee and contractor. The signature is not a formality — it is the evidence that the person received the policy and agreed to comply with it. Without it, the policy is unenforceable and the control typically fails in an audit.
Related Policies
Remote Access Policy: governs how employees securely connect to company systems from outside the office. The AUP’s device and network rules reference this directly — your remote access policy sets the technical requirements that the AUP obliges people to follow.
BYOD Policy: defines the rules for personal devices used to access company systems. Every AUP that covers BYOD should reference this policy explicitly rather than duplicating its content.
Data Classification Policy: defines what counts as confidential, sensitive, or public data. The AUP enforces the handling rules that flow from classification — without the classification policy, the AUP’s data handling section has no reference point.
Remote Working Security Policy: extends AUP obligations for employees working outside a traditional office. Where the AUP states the rules, the remote working security policy provides the controls that support them.