AP CSP Day 52: Privacy, PII, & Ethical Computing | Cycle 2

Key Concepts

The right to be forgotten (data erasure) conflicts with legitimate interests in maintaining accurate historical records and enabling fraud detection using transaction history. Digital permanence means that data shared online may be copied, indexed, or archived by third parties even after the original source deletes it. AP CSP Cycle 2 privacy questions present multi-stakeholder scenarios involving competing values and ask students to evaluate tradeoffs rather than identify a single correct answer. Recognizing that privacy, security, access, and accuracy can all conflict simultaneously is the key analytical skill at this difficulty level.

📚 Study the Concept First (Optional)

Privacy Tradeoffs: Competing Values

Privacy vs. Security

Stronger security often requires more data collection and monitoring, which reduces privacy. Encryption protects user privacy from eavesdroppers but can also hide criminal activity from legitimate law enforcement. These genuine tradeoffs have no universally correct answer.

Privacy vs. Convenience

Many services offer free access in exchange for personal data used to target advertising. Users gain convenience; the company gains behavioral data. Whether this is a fair exchange depends on whether users meaningfully understand and consent to the tradeoff.

Common Trap: Treating privacy questions as having a single objectively correct answer. AP CSP privacy questions often ask students to identify tradeoffs and evaluate competing interests, not to declare one value superior.
Exam Tip: On AP exam privacy questions, identify all stakeholders (user, company, government, public), what each gains, and what each loses under the described scenario. Strong exam answers acknowledge multiple legitimate interests.
Big Idea 5: Impact of Computing
Cycle 2 • Day 52 Practice • Hard Difficulty
Focus: Privacy, PII, & Ethical Computing

Practice Question

A fitness app collects step count, heart rate, sleep data, and GPS location. Its privacy policy states that data may be shared with "third-party partners." Which of the following are valid privacy concerns?

I. GPS data over time could reveal a user's home address, workplace, and daily routines.
II. Health data combined with identity information could be used by insurance companies to adjust premiums.
III. The phrase "third-party partners" is intentionally vague and prevents users from giving truly informed consent.

A) I and III only
B) I, II, and III
C) I only
D) III only

Why This Answer?

All three are valid concerns. I: Continuous GPS tracking reveals sensitive location patterns including home, work, and frequent destinations. II: Aggregated health and location data creates detailed profiles that could enable discriminatory practices by insurers or employers. III: Vague privacy language prevents users from understanding who receives their data, undermining informed consent.
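To make concern I concrete, here is a minimal sketch (not from the original question; the function name, grid size, and sample coordinates are illustrative assumptions) of how an app or its partners could infer a home location from nothing but timestamped GPS pings: bucket the pings into coarse grid cells and see where the device sits during overnight hours.

```python
from collections import Counter

def infer_home(pings):
    """Guess a user's home location from timestamped GPS pings.

    pings: list of (hour_of_day, lat, lon) tuples.
    Rounds coordinates to a coarse grid cell (2 decimal places,
    roughly 1 km) and returns the cell the device occupies most
    often during typical sleeping hours.
    """
    overnight = Counter()
    for hour, lat, lon in pings:
        if hour >= 22 or hour < 6:                 # overnight pings only
            cell = (round(lat, 2), round(lon, 2))  # coarse grid cell
            overnight[cell] += 1
    return overnight.most_common(1)[0][0] if overnight else None

# Hypothetical pings: three overnight readings near one address,
# one afternoon reading across town (e.g., a gym).
pings = [
    (23, 40.4406, -79.9959),
    (2,  40.4410, -79.9961),
    (23, 40.4405, -79.9958),
    (16, 40.4500, -79.9300),
]
print(infer_home(pings))  # → (40.44, -80.0)
```

The point for exam reasoning: no single ping is sensitive, but a simple frequency count over time recovers exactly the kind of location pattern Statement I describes.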

Why Not the Others?

A) Statement II is also a valid privacy concern — data aggregation across categories creates powerful profiles. C) Statements II and III are also valid. D) All three statements identify real privacy risks, not just III.

Common Mistake

Students underestimate the risk of data aggregation. Individual data points (steps, heart rate, location) may seem harmless alone, but combining them creates a comprehensive profile with significant privacy implications.

AP Exam Tip

On AP CSP, privacy questions often involve data aggregation — combining multiple types of data to create detailed profiles. Consider what different data types reveal when combined, not just individually.
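The aggregation risk above can be sketched in a few lines. This is a hypothetical illustration (the user id, thresholds, and field names are invented for the example, not taken from any real app): joining separately bland data streams by user id produces a single record that is far more revealing than any one stream.

```python
# Hypothetical per-sensor records, each keyed by an app-assigned user id.
steps     = {"user_17": 11_500}               # daily step count
resting   = {"user_17": 49}                   # resting heart rate, bpm
night_loc = {"user_17": (40.44, -80.0)}       # most frequent overnight GPS cell

def build_profile(uid):
    """Join individually harmless-looking streams into one revealing record."""
    return {
        "user": uid,
        # Ties the record to a physical address neighborhood:
        "likely_home": night_loc.get(uid),
        # Health inference an insurer or employer could act on:
        "fitness_level": ("high"
                          if steps.get(uid, 0) > 10_000
                          and resting.get(uid, 100) < 55
                          else "average"),
    }

print(build_profile("user_17"))
```

Each input dictionary alone says little; the joined profile links a health inference to a probable home location, which is the combination-of-data-types risk the exam tip warns about.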

Keep Practicing!

Consistent daily practice is the key to AP CSP success.
