SCENARIO
Looking back at your first two years as the Director of Personal Information Protection and Compliance for the St. Anne's Regional Medical Center in Thorn Bay, Ontario, Canada, you see a parade of accomplishments, from developing state-of-the-art simulation-based training for employees on privacy protection to establishing an interactive medical records system that is accessible by patients as well as by the medical personnel. Now, however, a question you have put off looms large: how do we manage all the data, not only records produced recently, but those still on hand from years ago? A data flow diagram generated last year shows multiple servers, databases, and workstations, many of which hold files that have not yet been incorporated into the new records system. While most of this data is encrypted, its persistence may pose security and compliance concerns. The situation is further complicated by several long-term studies being conducted by the medical staff using patient information. Having recently reviewed the major Canadian privacy regulations, you want to make certain that the medical center is observing them.
You recall a recent visit to the Records Storage Section in the basement of the old hospital next to the modern facility, where you noticed paper records sitting in crates labeled by year, by medical condition, or alphabetically by patient name, while others were in undifferentiated bundles on shelves and on the floor. On the back shelves of the section sat data tapes and old hard drives that were often unlabeled but appeared to be years old. On your way out of the records storage section, you noticed a man leaving whom you did not recognize. He carried a batch of folders under his arm, apparently records he had removed from storage.
You quickly realize that you need a plan of action on the maintenance, secure storage and disposal of data.
Which cryptographic standard would be most appropriate for protecting patient credit card information in the records system at St. Anne’s Regional Medical Center?
Truncating the last octet of an IP address because it is NOT needed is an example of which privacy principle?
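The technique this question describes, zeroing out the last octet of an IP address because full precision is not needed, can be sketched in a few lines. A minimal illustration (the function name is an assumption, not a standard API):

```python
import ipaddress

def truncate_ipv4(ip: str) -> str:
    """Zero out the last octet of an IPv4 address, discarding
    host-level precision that is not needed for the purpose at hand."""
    packed = bytearray(ipaddress.IPv4Address(ip).packed)
    packed[3] = 0  # drop the final octet
    return str(ipaddress.IPv4Address(bytes(packed)))

# Example: truncate_ipv4("203.0.113.42") -> "203.0.113.0"
```

The result still supports coarse geolocation or network-level analytics while no longer identifying a single host.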
Organizations understand there are aggregation risks associated with the way they process their customers' data. They typically include the details of this aggregation risk in a privacy notice and ask that all customers acknowledge they understand these risks and consent to the processing.
What type of risk response does this notice and consent represent?
Which of the following is the most important action to take prior to collecting personal data directly from a customer?
An organization is launching a new online subscription-based publication. As the service is not aimed at children, users are asked for their date of birth as part of the sign-up process. The privacy technologist suggests it may be more appropriate to ask whether an individual is over 18 rather than requiring them to provide a date of birth. What kind of threat is the privacy technologist concerned about?
SCENARIO
Please use the following to answer the next questions:
Your company is launching a new track and trace health app during the outbreak of a virus pandemic in the US. The developers claim the app is based on privacy by design because personal data collected was considered to ensure only necessary data is captured, users are presented with a privacy notice, and they are asked to give consent before data is shared. Users can update their consent after logging into an account, through a dedicated privacy and consent hub. This is accessible through the 'Settings' icon from any app page, then clicking 'My Preferences', and selecting 'Information Sharing and Consent' where the following choices are displayed:
• "I consent to receive notifications and infection alerts";
• "I consent to receive information on additional features or services, and new products";
• "I consent to sharing only my risk result and location information, for exposure and contact tracing purposes";
• "I consent to share my data for medical research purposes"; and
• "I consent to share my data with healthcare providers affiliated to the company".
For each choice, an ON or OFF toggle is available. The default setting is ON for all.
Users purchase a virus screening service for US$29.99 for themselves or others using the app. The virus screening service works as follows:
• Step 1: A photo of the user's face is taken.
• Step 2: The user measures their temperature and adds the reading in the app.
• Step 3: The user is asked to read sentences so that a voice analysis can detect symptoms.
• Step 4: The user is asked to answer questions on known symptoms.
• Step 5: The user can input information on family members (name, date of birth, citizenship, home address, phone number, email and relationship).
The results are displayed as one of the following risk statuses: "Low," "Medium" or "High." If the user is deemed at "Medium" or "High" risk, an alert may be sent to other users and the user is invited to seek a medical consultation and diagnosis from a healthcare provider.
A user's risk status also feeds a world map for contact tracing purposes, where users are able to check if they have been or are in close proximity of an infected person. If a user has come in contact with another individual classified as "Medium" or "High" risk, an instant notification alerts the user of this. The app collects location trails of every user to monitor locations visited by an infected individual. Location is collected using the phone's GPS functionality, whether the app is in use or not; however, the exact location of the user is "blurred" for privacy reasons. Users can only see circles on the map.
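The "blurring" the scenario describes can be approximated by simply reducing coordinate precision before display. A minimal sketch of that idea (the function name and the chosen precision are illustrative assumptions, not the app's actual method):

```python
def blur_location(lat: float, lon: float, decimals: int = 2) -> tuple:
    """Round coordinates before display. Two decimal places of latitude
    corresponds to roughly 1.1 km, so the exact position is hidden and
    only a coarse circle of possible locations remains."""
    return (round(lat, decimals), round(lon, decimals))

# Example: blur_location(51.507351, -0.127758) -> (51.51, -0.13)
```

Note that rounding alone is a weak protection for location trails; repeated observations of the same blurred cell can still re-identify a user, which is why the questions below probe this design.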
Which technology is best suited for the contact tracing feature of the app?
SCENARIO
Please use the following to answer the next questions:
Your company is launching a new track and trace health app during the outbreak of a virus pandemic in the US. The developers claim the app is based on privacy by design because personal data collected was considered to ensure only necessary data is captured, users are presented with a privacy notice, and they are asked to give consent before data is shared. Users can update their consent after logging into an account, through a dedicated privacy and consent hub. This is accessible through the 'Settings' icon from any app page, then clicking 'My Preferences', and selecting 'Information Sharing and Consent' where the following choices are displayed:
• "I consent to receive notifications and infection alerts";
• "I consent to receive information on additional features or services, and new products";
• "I consent to sharing only my risk result and location information, for exposure and contact tracing purposes";
• "I consent to share my data for medical research purposes"; and
• "I consent to share my data with healthcare providers affiliated to the company".
For each choice, an ON or OFF toggle is available. The default setting is ON for all.
Users purchase a virus screening service for US$29.99 for themselves or others using the app. The virus screening service works as follows:
• Step 1: A photo of the user's face is taken.
• Step 2: The user measures their temperature and adds the reading in the app.
• Step 3: The user is asked to read sentences so that a voice analysis can detect symptoms.
• Step 4: The user is asked to answer questions on known symptoms.
• Step 5: The user can input information on family members (name, date of birth, citizenship, home address, phone number, email and relationship).
The results are displayed as one of the following risk statuses: "Low," "Medium" or "High." If the user is deemed at "Medium" or "High" risk, an alert may be sent to other users and the user is invited to seek a medical consultation and diagnosis from a healthcare provider.
A user's risk status also feeds a world map for contact tracing purposes, where users are able to check if they have been or are in close proximity of an infected person. If a user has come in contact with another individual classified as "Medium" or "High" risk, an instant notification alerts the user of this. The app collects location trails of every user to monitor locations visited by an infected individual. Location is collected using the phone's GPS functionality, whether the app is in use or not; however, the exact location of the user is "blurred" for privacy reasons. Users can only see circles on the map.
Which of the following pieces of information collected is the LEAST likely to be justified for the purposes of the app?
How can a hacker gain control of a smartphone to perform remote audio and video surveillance?
What is the distinguishing feature of asymmetric encryption?
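As background for this question, the asymmetric property, that what one key encrypts only the other key can decrypt, can be illustrated with textbook RSA on deliberately tiny primes. This is a teaching toy only, never secure in practice:

```python
# Textbook RSA with tiny primes (the classic p=61, q=53 example).
p, q = 61, 53
n = p * q                 # modulus: 3233
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent, coprime with phi
d = pow(e, -1, phi)       # private exponent via modular inverse (Python 3.8+)

m = 65                    # the message
c = pow(m, e, n)          # encrypt with the PUBLIC key (e, n)
recovered = pow(c, d, n)  # decrypt with the PRIVATE key (d, n)
assert recovered == m
```

The key pair is mathematically linked but not interchangeable in the way a single shared secret is, which is the distinguishing feature relative to symmetric ciphers.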
Which of the following is NOT a step in the methodology of a privacy risk framework?
What is the term for information provided to a social network by a member?
Which of the following would be an example of an "objective" privacy harm to an individual?
An organization is launching a new smart speaker to the market. The device will have the capability to play music and provide news and weather updates. Which of the following would be a concern from a privacy perspective?
SCENARIO
Please use the following to answer the next questions:
Your company is launching a new track and trace health app during the outbreak of a virus pandemic in the US. The developers claim the app is based on privacy by design because personal data collected was considered to ensure only necessary data is captured, users are presented with a privacy notice, and they are asked to give consent before data is shared. Users can update their consent after logging into an account, through a dedicated privacy and consent hub. This is accessible through the 'Settings' icon from any app page, then clicking 'My Preferences', and selecting 'Information Sharing and Consent' where the following choices are displayed:
• "I consent to receive notifications and infection alerts";
• "I consent to receive information on additional features or services, and new products";
• "I consent to sharing only my risk result and location information, for exposure and contact tracing purposes";
• "I consent to share my data for medical research purposes"; and
• "I consent to share my data with healthcare providers affiliated to the company".
For each choice, an ON or OFF toggle is available. The default setting is ON for all.
Users purchase a virus screening service for US$29.99 for themselves or others using the app. The virus screening service works as follows:
• Step 1: A photo of the user's face is taken.
• Step 2: The user measures their temperature and adds the reading in the app.
• Step 3: The user is asked to read sentences so that a voice analysis can detect symptoms.
• Step 4: The user is asked to answer questions on known symptoms.
• Step 5: The user can input information on family members (name, date of birth, citizenship, home address, phone number, email and relationship).
The results are displayed as one of the following risk statuses: "Low," "Medium" or "High." If the user is deemed at "Medium" or "High" risk, an alert may be sent to other users and the user is invited to seek a medical consultation and diagnosis from a healthcare provider.
A user's risk status also feeds a world map for contact tracing purposes, where users are able to check if they have been or are in close proximity of an infected person. If a user has come in contact with another individual classified as "Medium" or "High" risk, an instant notification alerts the user of this. The app collects location trails of every user to monitor locations visited by an infected individual. Location is collected using the phone's GPS functionality, whether the app is in use or not; however, the exact location of the user is "blurred" for privacy reasons. Users can only see circles on the map.
The location data collected and displayed on the map should be changed for which of the following reasons?
What privacy risk is NOT mitigated by the use of encrypted computation to target and serve online ads?
Revocation and reissuing of compromised credentials is impossible for which of the following authentication techniques?
SCENARIO
Wesley Energy has finally made its move, acquiring the venerable oil and gas exploration firm Lancelot from its long-time owner David Wilson. As a member of the transition team, you have come to realize that Wilson's quirky nature affected even Lancelot's data practices, which are maddeningly inconsistent. “The old man hired and fired IT people like he was changing his necktie,” one of Wilson’s seasoned lieutenants tells you, as you identify the traces of initiatives left half complete.
For instance, while some proprietary data and personal information on clients and employees is encrypted, other sensitive information, including health information from surveillance testing of employees for toxic exposures, remains unencrypted, particularly when included within longer records with less-sensitive data. You also find that data is scattered across applications, servers and facilities in a manner that at first glance seems almost random.
Among your preliminary findings of the condition of data at Lancelot are the following:
Cloud technology is supplied by vendors around the world, including firms that you have not heard of. You are told by a former Lancelot employee that these vendors operate with divergent security requirements and protocols.
The company’s proprietary recovery process for shale oil is stored on servers among a variety of less-sensitive information that can be accessed not only by scientists, but by personnel of all types at most company locations.
DES is the strongest encryption algorithm currently used for any file.
Several company facilities lack physical security controls, beyond visitor check-in, which familiar vendors often bypass.
Fixing all of this will take work, but first you need to grasp the scope of the mess and formulate a plan of action to address it.
Which procedure should be employed to identify the types and locations of data held by Wesley Energy?
Which activity best supports the principle of data quality from a privacy perspective?
Which of the following CANNOT be effectively determined during a code audit?
SCENARIO
Please use the following to answer the next question:
Jordan just joined a fitness-tracker start-up based in California, USA, as its first Information Privacy and Security Officer. The company is quickly growing its business but does not sell any of the fitness trackers itself. Instead, it relies on a distribution network of third-party retailers in all major countries. Despite not having any stores, the company has a 78% market share in the EU. It has a website presenting the company and products, and a member section where customers can access their information. Only the email address and physical address need to be provided as part of the registration process in order to customize the site to the user's region and country. There is also a newsletter sent every month to all members featuring fitness tips, nutrition advice, and product spotlights from partner companies based on user behavior and preferences.
Jordan says the General Data Protection Regulation (GDPR) does not apply to the company. He says the company is not established in the EU, nor does it have a processor in the region. Furthermore, it does not do any “offering goods or services” in the EU since it does not do any marketing there, nor sell to consumers directly. Jordan argues that it is the customers who chose to buy the products on their own initiative and there is no “offering” from the company.
The fitness trackers incorporate advanced features such as sleep tracking, GPS tracking, heart rate monitoring, wireless syncing, calorie-counting and step-tracking. The watch must be paired with either a smartphone or a computer in order to collect data on sleep levels, heart rates, etc. All information from the device must be sent to the company's servers in order to be processed, and then the results are sent to the smartphone or computer. Jordan argues that there is no personal information involved since the company does not collect banking or social security information.
Why is Jordan’s claim that the company does not collect personal information as identified by the GDPR inaccurate?
Which of the following statements is true regarding software notifications and agreements?
A credit card with the last few numbers visible is an example of what?
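The practice the question refers to, showing only the last few digits of a card number, can be sketched as a simple masking routine (the helper name is hypothetical):

```python
def mask_pan(pan: str, visible: int = 4) -> str:
    """Mask a primary account number (PAN), leaving only the last
    `visible` digits readable, e.g. for display on a receipt."""
    if len(pan) <= visible:
        return pan
    return "*" * (len(pan) - visible) + pan[-visible:]

# Example: mask_pan("4111111111111111") -> "************1111"
```

The masked value lets a cardholder recognize their own card without exposing the full number, which is the point the question is testing.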
SCENARIO
WebTracker Limited is a cloud-based online marketing service located in London. Last year, WebTracker migrated its IT infrastructure to the cloud provider AmaZure, which provides SQL Databases and Artificial Intelligence services to WebTracker. The roles and responsibilities between the two companies have been formalized in a standard contract, which includes allocating the role of data controller to WebTracker.
The CEO of WebTracker, Mr. Bond, would like to assess the effectiveness of AmaZure's privacy controls, and he recently decided to hire you as an independent auditor. The scope of the engagement is limited only to the marketing services provided by WebTracker; you will not be evaluating any internal data processing activity, such as HR or Payroll.
This ad-hoc audit was triggered due to a future partnership between WebTracker and SmartHome — a partnership that will not require any data sharing. SmartHome is based in the USA, and most recently has dedicated substantial resources to developing smart refrigerators that can suggest the recommended daily calorie intake based on DNA information. This and other personal data is collected by WebTracker.
To get an idea of the scope of work involved, you have decided to start reviewing the company's documentation and interviewing key staff to understand potential privacy risks.
The results of this initial work include the following notes:
There are several typos in the current privacy notice of WebTracker, and you were not able to find the privacy notice for SmartHome.
You were unable to identify all the sub-processors working for SmartHome. No subcontractor is indicated in the cloud agreement with AmaZure, which is responsible for the support and maintenance of the cloud infrastructure.
There are data flows representing personal data being collected from the internal employees of WebTracker, including an interface from the HR system.
Part of the DNA data collected by WebTracker was from employees, as this was a prototype approved by the CEO of WebTracker.
All the WebTracker and SmartHome customers are based in the USA and Canada.
Which of the following issues is most likely to require an investigation by the Chief Privacy Officer (CPO) of WebTracker?
Which of the following would be an example of an "objective" privacy harm to an individual, based on Calo's Harm Dimensions?
Which is NOT a way to validate a person's identity?
In the realm of artificial intelligence, how has deep learning enabled greater implementation of machine learning?
What is the main privacy threat posed by Radio Frequency Identification (RFID)?
When writing security policies, the most important consideration is to?
What logs should an application server retain in order to prevent phishing attacks while minimizing data retention?
Which of the following activities would be considered the best method for an organization to achieve the privacy principle of data quality?
Combining multiple pieces of information about an individual to produce a whole that is greater than the sum of its parts is called?
Which of the following suggests the greatest degree of transparency?
What risk is mitigated when routing meeting video traffic through a company’s application servers rather than sending the video traffic directly from one user to another?
A privacy engineer reviews a newly developed on-line registration page on a company’s website. The purpose of the page is to enable corporate customers to submit a returns / refund request for physical goods. The page displays the following data capture fields: company name, account reference, company address, contact name, email address, contact phone number, product name, quantity, issue description and company bank account details.
After her review, the privacy engineer recommends setting certain capture fields as “non-mandatory”. Setting which of the following fields as “non-mandatory” would be the best example of the principle of data minimization?
Granting data subjects the right to have data corrected, amended, or deleted describes?
A vendor has been collecting data under an old contract, not aligned with the practices of the organization.
Which is the preferred response?
SCENARIO
It should be the most secure location housing data in all of Europe, if not the world. The Global Finance Data Collective (GFDC) stores financial information and other types of client data from large banks, insurance companies, multinational corporations and governmental agencies. After a long climb on a mountain road that leads only to the facility, you arrive at the security booth. Your credentials are checked and checked again by the guard to visually verify that you are the person pictured on your passport and national identification card. You are led down a long corridor with server rooms on each side, secured by combination locks built into the doors. You climb a flight of stairs and are led into an office that is lighted brilliantly by skylights where the GFDC Director of Security, Dr. Monique Batch, greets you. On the far wall you notice a bank of video screens showing different rooms in the facility. At the far end, several screens show different sections of the road up the mountain.
Dr. Batch explains once again your mission. As a data security auditor and consultant, it is a dream assignment: The GFDC does not want simply adequate controls, but the best and most effective security that current technologies allow.
“We were hacked twice last year,” Dr. Batch says, “and although only a small number of records were stolen, the bad press impacted our business. Our clients count on us to provide security that is nothing short of impenetrable and to do so quietly. We hope to never make the news again.” She notes that it is also essential that the facility is in compliance with all relevant security regulations and standards.
You have been asked to verify compliance as well as to evaluate all current security controls and security measures, including data encryption methods, authentication controls and the safest methods for transferring data into and out of the facility. As you prepare to begin your analysis, you find yourself considering an intriguing question: Can these people be sure that I am who I say I am?
You are shown to the office made available to you and are provided with system login information, including the name of the wireless network and a wireless key. Still pondering, you attempt to pull up the facility's wireless network, but no networks appear in the wireless list. When you search for the wireless network by name, however, it is readily found.
What measures can protect client information stored at GFDC?
Which of the following statements describes an acceptable disclosure practice?
What element is most conducive to fostering a sound privacy by design culture in an organization?
Implementation of privacy controls for compliance with the requirements of the Children’s Online Privacy Protection Act (COPPA) is necessary for all the following situations EXCEPT?
When analyzing user data, how is differential privacy applied?
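As background for this question: a common way differential privacy is applied to user data is adding Laplace noise calibrated to the query's sensitivity before releasing an aggregate. A minimal sketch for a counting query (function names are illustrative, not from any specific library):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via the inverse-CDF method."""
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float) -> float:
    # A counting query has sensitivity 1 (one person changes the count
    # by at most 1), so Laplace(1/epsilon) noise yields
    # epsilon-differential privacy for the released count.
    return true_count + laplace_noise(1.0 / epsilon)
```

Each individual's presence or absence changes the released value only within the noise, so the analyst sees useful aggregates while no single user's data is revealed.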
During a transport layer security (TLS) session, what happens immediately after the web browser creates a random PreMasterSecret?
SCENARIO
Please use the following to answer the next question:
Chuck, a compliance auditor for a consulting firm focusing on healthcare clients, was required to travel to the client's office to perform an onsite review of the client's operations. He rented a car from Finley Motors upon arrival at the airport so he could commute to and from the client's office. The car rental agreement was electronically signed by Chuck and included his name, address, driver's license, make/model of the car, billing rate, and additional details describing the rental transaction. On the second night, Chuck was caught by a red light camera not stopping at an intersection on his way to dinner. Chuck returned the car to the car rental agency at the end of the week without mentioning the infraction, and Finley Motors emailed a copy of the final receipt to the address on file.
Local law enforcement later reviewed the red light camera footage. As Finley Motors is the registered owner of the car, a notice was sent to them indicating the infraction and fine incurred. This notice included the license plate number, occurrence date and time, a photograph of the driver, and a web portal link to a video clip of the violation for further review. Finley Motors, however, was not responsible for the violation as they were not driving the car at the time and transferred the incident to AMP Payment Resources for further review. AMP Payment Resources identified Chuck as the driver based on the rental agreement he signed when picking up the car and then contacted Chuck directly through a written letter regarding the infraction to collect the fine.
After reviewing the incident through the AMP Payment Resources’ web portal, Chuck paid the fine using his personal credit card. Two weeks later, Finley Motors sent Chuck an email promotion offering 10% off a future rental.
How can Finley Motors reduce the risk associated with transferring Chuck’s personal information to AMP Payment Resources?
SCENARIO
Wesley Energy has finally made its move, acquiring the venerable oil and gas exploration firm Lancelot from its long-time owner David Wilson. As a member of the transition team, you have come to realize that Wilson's quirky nature affected even Lancelot's data practices, which are maddeningly inconsistent. “The old man hired and fired IT people like he was changing his necktie,” one of Wilson’s seasoned lieutenants tells you, as you identify the traces of initiatives left half complete.
For instance, while some proprietary data and personal information on clients and employees is encrypted, other sensitive information, including health information from surveillance testing of employees for toxic exposures, remains unencrypted, particularly when included within longer records with less-sensitive data. You also find that data is scattered across applications, servers and facilities in a manner that at first glance seems almost random.
Among your preliminary findings of the condition of data at Lancelot are the following:
Cloud technology is supplied by vendors around the world, including firms that you have not heard of. You are told by a former Lancelot employee that these vendors operate with divergent security requirements and protocols.
The company’s proprietary recovery process for shale oil is stored on servers among a variety of less-sensitive information that can be accessed not only by scientists, but by personnel of all types at most company locations.
DES is the strongest encryption algorithm currently used for any file.
Several company facilities lack physical security controls, beyond visitor check-in, which familiar vendors often bypass.
Fixing all of this will take work, but first you need to grasp the scope of the mess and formulate a plan of action to address it.
Which is true regarding the type of encryption Lancelot uses?
What was the first privacy framework to be developed?
SCENARIO
WebTracker Limited is a cloud-based online marketing service located in London. Last year, WebTracker migrated its IT infrastructure to the cloud provider AmaZure, which provides SQL Databases and Artificial Intelligence services to WebTracker. The roles and responsibilities between the two companies have been formalized in a standard contract, which includes allocating the role of data controller to WebTracker.
The CEO of WebTracker, Mr. Bond, would like to assess the effectiveness of AmaZure's privacy controls, and he recently decided to hire you as an independent auditor. The scope of the engagement is limited only to the marketing services provided by WebTracker; you will not be evaluating any internal data processing activity, such as HR or Payroll.
This ad-hoc audit was triggered due to a future partnership between WebTracker and SmartHome — a partnership that will not require any data sharing. SmartHome is based in the USA, and most recently has dedicated substantial resources to developing smart refrigerators that can suggest the recommended daily calorie intake based on DNA information. This and other personal data is collected by WebTracker.
To get an idea of the scope of work involved, you have decided to start reviewing the company's documentation and interviewing key staff to understand potential privacy risks.
The results of this initial work include the following notes:
There are several typos in the current privacy notice of WebTracker, and you were not able to find the privacy notice for SmartHome.
You were unable to identify all the sub-processors working for SmartHome. No subcontractor is indicated in the cloud agreement with AmaZure, which is responsible for the support and maintenance of the cloud infrastructure.
There are data flows representing personal data being collected from the internal employees of WebTracker, including an interface from the HR system.
Part of the DNA data collected by WebTracker was from employees, as this was a prototype approved by the CEO of WebTracker.
All the WebTracker and SmartHome customers are based in the USA and Canada.
Based on the initial assessment and review of the available data flows, which of the following would be the most important privacy risk you should investigate first?
Which of the following methods does NOT contribute to keeping the data confidential?
SCENARIO
Looking back at your first two years as the Director of Personal Information Protection and Compliance for the Berry Country Regional Medical Center in Thorn Bay, Ontario, Canada, you see a parade of accomplishments, from developing state-of-the-art simulation-based training for employees on privacy protection to establishing an interactive medical records system that is accessible by patients as well as by the medical personnel. Now, however, a question you have put off looms large: how do we manage all the data, not only records produced recently, but those still on hand from years ago? A data flow diagram generated last year shows multiple servers, databases, and workstations, many of which hold files that have not yet been incorporated into the new records system. While most of this data is encrypted, its persistence may pose security and compliance concerns. The situation is further complicated by several long-term studies being conducted by the medical staff using patient information. Having recently reviewed the major Canadian privacy regulations, you want to make certain that the medical center is observing them.
You also recall a recent visit to the Records Storage Section, often termed "The Dungeon," in the basement of the old hospital next to the modern facility, where you noticed a multitude of paper records. Some of these were in crates marked by year, by medical condition, or alphabetically by patient name, while others were in undifferentiated bundles on shelves and on the floor. The back shelves of the section housed data tapes and old hard drives that were often unlabeled but appeared to be years old. On your way out of the dungeon, you noticed just ahead of you a small man in a lab coat whom you did not recognize. He carried a batch of folders under his arm, apparently records he had removed from storage.
Which cryptographic standard would be most appropriate for protecting patient credit card information in the records system?
SCENARIO
It should be the most secure location housing data in all of Europe, if not the world. The Global Finance Data Collective (GFDC) stores financial information and other types of client data from large banks, insurance companies, multinational corporations and governmental agencies. After a long climb on a mountain road that leads only to the facility, you arrive at the security booth. Your credentials are checked and checked again by the guard to visually verify that you are the person pictured on your passport and national identification card. You are led down a long corridor with server rooms on each side, secured by combination locks built into the doors. You climb a flight of stairs and are led into an office lit brilliantly by skylights, where the GFDC Director of Security, Dr. Monique Batch, greets you. On the far wall you notice a bank of video screens showing different rooms in the facility. At the far end, several screens show different sections of the road up the mountain.
Dr. Batch explains once again your mission. As a data security auditor and consultant, it is a dream assignment: The GFDC does not want simply adequate controls, but the best and most effective security that current technologies allow.
“We were hacked twice last year,” Dr. Batch says, “and although only a small number of records were stolen, the bad press impacted our business. Our clients count on us to provide security that is nothing short of impenetrable and to do so quietly. We hope to never make the news again.” She notes that it is also essential that the facility is in compliance with all relevant security regulations and standards.
You have been asked to verify compliance as well as to evaluate all current security controls and security measures, including data encryption methods, authentication controls and the safest methods for transferring data into and out of the facility. As you prepare to begin your analysis, you find yourself considering an intriguing question: Can these people be sure that I am who I say I am?
You are shown to the office made available to you and are provided with system login information, including the name of the wireless network and a wireless key. Still pondering, you attempt to pull up the facility's wireless network, but no networks appear in the wireless list. When you search for the wireless network by name, however, it is readily found.
Why would you recommend that GFDC use record encryption rather than disk, file or table encryption?
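Record-level encryption is typically favored in scenarios like this because each record can be protected, accessed, and revoked independently of the disk, file, or table that holds it. A minimal sketch of the idea, assuming a per-record key derived from a master key with HMAC and using a toy SHA-256 stream cipher in place of AES (the identifiers are illustrative):

```python
import hashlib
import hmac
import os
import secrets

def record_key(master: bytes, record_id: str) -> bytes:
    """Derive an independent key per record (simplified HKDF-style)."""
    return hmac.new(master, record_id.encode(), hashlib.sha256).digest()

def xor_stream(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """SHA-256 in counter mode: a toy stand-in for AES-CTR."""
    ks, ctr = b"", 0
    while len(ks) < len(data):
        ks += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(d ^ k for d, k in zip(data, ks))

def encrypt_record(master: bytes, record_id: str, plaintext: bytes):
    """Encrypt one record under its own derived key."""
    nonce = os.urandom(16)
    return nonce, xor_stream(record_key(master, record_id), nonce, plaintext)

master = secrets.token_bytes(32)
nonce, ct = encrypt_record(master, "client:1001", b"balance: 1,250,000")
```

Because every record has its own key, compromising or releasing one record's key exposes only that record, whereas a single disk-, file-, or table-level key exposes everything it covers at once.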
Which of the following is a privacy consideration for NOT sending large-scale SPAM-type emails to a database of email addresses?
What is a main benefit of data aggregation?
Which of the following would be the best method of ensuring that Information Technology projects follow Privacy by Design (PbD) principles?
Which of the following is NOT a workplace surveillance best practice?
A key principle of an effective privacy policy is that it should be:
Which of the following does NOT illustrate the ‘respect to user privacy’ principle?
Which of the following is an example of the privacy risks associated with the Internet of Things (IoT)?
Which activity would best support the principle of data quality?
Which concept related to privacy choice is demonstrated by highlighting and bolding the "accept" button on a cookies notice while maintaining standard text format for other options?
Which is NOT a suitable action to apply to data when the retention period ends?
Which of the following best describes the basic concept of "Privacy by Design?"
An organization is launching a smart watch which, in addition to alerts, will notify the wearer of incoming calls, allowing them to answer on the device. This convenience also comes with privacy concerns and is an example of:
Which privacy engineering objective proposed by the US National Institute of Standards and Technology (NIST) decreases privacy risk by ensuring that connections between individuals and their personal data are reduced?
A clinical research organization is processing highly sensitive personal data, including numerical attributes, from medical trial results. The organization needs to manipulate the data without revealing the contents to data users. This can be achieved by utilizing?
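This stem points at techniques such as homomorphic encryption, which allow numerical attributes to be combined while still encrypted, so data users never see the underlying values. A toy Paillier cryptosystem (additively homomorphic) illustrates the idea below; the demo-sized primes are for exposition only, and real deployments use vetted libraries with 2048-bit or larger moduli.

```python
import math
import secrets

# Toy Paillier cryptosystem: additively homomorphic public-key encryption.
p, q = 1_000_003, 1_000_033          # demo primes only
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def enc(m: int) -> int:
    """Encrypt integer m with fresh randomness r."""
    r = secrets.randbelow(n - 1) + 1
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c: int) -> int:
    """Decrypt using the private values lam and mu."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Two encrypted trial measurements, e.g. readings of 120 and 80
c1, c2 = enc(120), enc(80)
# Multiplying ciphertexts adds the hidden plaintexts
total = dec((c1 * c2) % n2)
```

A data user holding only the public key can aggregate encrypted trial results; only the key holder can decrypt the sum, and no individual value is ever revealed in the clear.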
SCENARIO
Please use the following to answer the next question:
Chuck, a compliance auditor for a consulting firm focusing on healthcare clients, was required to travel to the client’s office to perform an onsite review of the client’s operations. He rented a car from Finley Motors upon arrival at the airport so he could commute to and from the client’s office. The car rental agreement was electronically signed by Chuck and included his name, address, driver’s license, make/model of the car, billing rate, and additional details describing the rental transaction. On the second night, Chuck was caught by a red-light camera failing to stop at an intersection on his way to dinner. Chuck returned the car to the rental agency at the end of the week without mentioning the infraction, and Finley Motors emailed a copy of the final receipt to the address on file.
Local law enforcement later reviewed the red light camera footage. As Finley Motors is the registered owner of the car, a notice was sent to them indicating the infraction and fine incurred. This notice included the license plate number, occurrence date and time, a photograph of the driver, and a web portal link to a video clip of the violation for further review. Finley Motors, however, was not responsible for the violation as they were not driving the car at the time and transferred the incident to AMP Payment Resources for further review. AMP Payment Resources identified Chuck as the driver based on the rental agreement he signed when picking up the car and then contacted Chuck directly through a written letter regarding the infraction to collect the fine.
After reviewing the incident through the AMP Payment Resources’ web portal, Chuck paid the fine using his personal credit card. Two weeks later, Finley Motors sent Chuck an email promotion offering 10% off a future rental.
What is the most secure method Finley Motors should use to transmit Chuck’s information to AMP Payment Resources?
An organization relies on temporary contractors to perform data analytics, and the contractors require access to personal data via software-as-a-service to perform their job. When a temporary contractor completes their work assignment, what would be the most effective way to safeguard privacy and access to personal data when they leave?
Which technique is most likely to facilitate the deletion of every instance of data associated with a deleted user account from every data store held by an organization?
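One technique often cited for this problem is cryptographic erasure ("crypto-shredding"): every copy of an account's data, wherever it is replicated, is encrypted under a single per-user key, so deleting that one key renders every instance unreadable at once, without having to locate each copy. A minimal sketch, assuming a hypothetical in-memory key vault and a toy SHA-256 stream cipher standing in for AES:

```python
import hashlib
import os
import secrets

key_vault: dict[str, bytes] = {}  # hypothetical central key store

def xor_stream(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """SHA-256 in counter mode: a toy stand-in for AES-CTR."""
    ks, ctr = b"", 0
    while len(ks) < len(data):
        ks += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(d ^ k for d, k in zip(data, ks))

def store(user: str, data: bytes) -> tuple[bytes, bytes]:
    """Encrypt a user's data under that user's single key."""
    key = key_vault.setdefault(user, secrets.token_bytes(32))
    nonce = os.urandom(16)
    return nonce, xor_stream(key, nonce, data)

def read(user: str, nonce: bytes, ct: bytes) -> bytes:
    if user not in key_vault:
        raise KeyError("key shredded; every copy is now unrecoverable")
    return xor_stream(key_vault[user], nonce, ct)

# The same ciphertext may live in many stores; all depend on one key.
primary = store("alice", b"account history")
backup = primary  # replicated copy held in another data store
```

Deleting `key_vault["alice"]` then logically deletes the primary copy, the backup, and any other replica in a single operation.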