Safety Check in iOS 16 Puts Abuse Survivors Back in Control

By Renee Vaux | 2022-09-22 21:57

What's happening
Apple announced a new Safety Check feature to help potential victims in abusive relationships.

Why it matters
It's the latest example of the tech industry taking on tough personal technology issues that don't have clear or easy answers.

What's next
Apple is communicating with victim-survivor advocacy organizations to identify other features that can help people in crisis.

Among the long-requested and popular new features Apple plans to bring to the iPhone this fall is one that isn't just a convenience -- using it could mean life or death.
On Monday, Apple announced Safety Check, a new setting designed to aid domestic violence victims. Coming this fall with iOS 16, it's meant to help someone quickly cut ties with a potential abuser. Safety Check does this by helping a person quickly see with whom they're automatically sharing sensitive information, like their location or photos. But in an emergency, it also lets a person simply and quickly disable access and information sharing from every device other than the one in their hands.

Notably, the feature also includes a prominent button at the top right of the screen, labeled Quick Exit. As the name implies, it's designed to help a potential victim quickly hide that they've been looking at Safety Check, in case their abuser doesn't allow them privacy. If the abuser reopens the Settings app, where Safety Check lives, it'll start at the default general settings page, effectively covering the victim's tracks.

"Many people share passwords and access to their devices with a partner," Katie Skinner, a privacy engineering manager at Apple, said at the company's WWDC event Monday. "However, in abusive relationships, this can threaten personal safety and make it harder for victims to get help."

Safety Check, and the careful way in which it was built, are part of a larger effort among tech companies to stop their products from being used as tools of abuse. It's also the latest sign of Apple's willingness to wade into building technology that tackles sensitive topics. And though the company says it's earnest in its approach, it's drawn criticism for some of its moves. Last year, the company announced efforts to detect child exploitation imagery on some of its phones, tablets and computers, a move that worried critics.

Still, victim advocates say Apple's one of the few large companies publicly working on these issues. While many tech giants, including Microsoft, Facebook, Twitter and Google, have built and implemented systems to moderate content and behavior on their respective sites, they've struggled to build tools that stop abuse as it's happening.

Unfortunately, the abuse has gotten worse. A November 2020 survey of practitioners who work on domestic violence found that 99.3% had clients who had experienced abuse facilitated by technology, according to the advocacy group that produced the report with Curtin University in Australia. Moreover, reports of tracking of victims had jumped more than 244% since the survey was last conducted in 2015.

Amid all this, tech companies like Apple have increasingly worked with victim organizations to understand how their tools can be both misused by a perpetrator and helpful to a potential victim. The result is features, like Safety Check's Quick Exit button, that advocates say are a sign Apple's building these tools in what they call a "trauma-informed" way.

"Most people cannot appreciate the sense of urgency" many victims have, said , executive director of the National Center for Victims of Crime. "Apple's been very receptive."

Apple says there are more than a billion iPhones being used around the world.

Tough issues
Some of the tech industry's biggest wins have come from identifying abusers. In 2009, Microsoft helped create image recognition software called PhotoDNA, which is now used by social networks and websites around the world to identify known child exploitation imagery when it's uploaded to the internet. Similar programs have since been built to help identify other prohibited material, such as livestreams of violence, that large tech companies try to keep off their platforms.

As tech has become more pervasive in our lives, these efforts have taken on increased importance. And unlike adding a new video technology or increasing a computer's performance, these social issues don't always have clear answers.

In 2021, Apple made one of its first public moves into victim-focused technology when it announced new features for its iMessage service designed to analyze messages sent to users marked as children and flag nudity. If its system suspected an image, it would blur the attachment and warn the person receiving it, to make sure they'd wanted to see it. Apple's service would also point children to resources that could help them if they're being victimized through the service.

At the time, Apple said it built the message-scanning technology with privacy in mind. But activists worried Apple's system was also designed to alert an identified parent if their child chose to view the suspected attached image anyway. That, some critics said, could incite abuse from a potentially dangerous parent.

Apple's additional efforts to detect potential child abuse images that might be synchronized to its photo service through iPhones, iPads and Mac computers were criticized by security experts who worried the system could be repurposed for broader surveillance.

Still, victim advocates acknowledged that Apple was one of the few device companies working on tools meant to support victims of potential abuse as it's happening. Microsoft and Google didn't respond to requests for comment about whether they plan to introduce features akin to Safety Check to help victims who might be using Windows and Xbox software for PCs and video game consoles, or Android mobile software for phones and tablets.

Apple introduced a system for child safety in iMessages last year.

Learning, but much to do
The tech industry has been working with victims' organizations for over a decade, seeking ways to adopt safety mindsets within its products. Advocates say that in the past few years in particular, many teams have formed within the tech giants, staffed in some cases with people from the nonprofit world who worked on the issues the tech industry was taking on.

Apple started consulting with some victims rights advocates about Safety Check last year, asking for input and ideas for how to best build the system. 

"We are starting to see recognition that there is a corporate or social responsibility to ensure your apps can't be too simply misused," Karen Bentley, . And she said that's particularly tough because, as technology has evolved to become easier to use, so has the potential for it to be a tool of abuse.

That's part of why she says Apple's Safety Check is "brilliant," because it can quickly and easily separate someone's digital information and communications from their abuser. "If you're experiencing domestic violence you're likely to be experiencing some of that violence in technology," she said.

Though Safety Check has moved from an idea into test software and will be made widely available with the iOS 16 suite of software updates for iPhones and iPads in the fall, Apple said it plans more work on these issues. 

Unfortunately, Safety Check doesn't extend to the ways abusers might track people using devices the victim doesn't own -- such as if someone slips one of Apple's $29 AirTag trackers into their coat pocket or onto their car. Safety Check also isn't designed for phones set up under child accounts for people under the age of 13, though the feature's still in testing and could change.

"Unfortunately, abusers are persistent and are constantly updating their tactics," said Erica Olsen, project director for , a program from the National Network to End Domestic Violence that trains companies, community groups and governments on how to improve victim safety and privacy. "There will always be more to do in this space."

Apple said it's expanding training for employees who interact with customers, including salespeople in its stores, so they know how features like Safety Check work and can explain the feature when appropriate. The company has also created guidelines to help its support staff identify and assist potential victims.

In one instance, for example, AppleCare teams are being taught to listen for when an iPhone owner calls expressing concern that they don't have control over their own device or their own iCloud account. In another, AppleCare can guide someone on how to remove their Apple ID from a family group.

Apple also updated its support guidance in January to instruct people how to reset and regain control of an iCloud account that might be compromised or being used as a tool for abuse.

Craig Federighi, Apple's head of software engineering, said the company will continue expanding its personal safety features as part of its larger commitment to its customers. "Protecting you and your privacy is, and will always be, the center of what we do," he said.
