Template:Science and technology: Difference between revisions

From BoyWiki
<noinclude>{{ambox | image = [[File:Information icon.png|50px]]| border = blue | text = This is a News template. To see the page it relates to go to: [[Portal:Boylove News Channel/Science, Technology, Health]] }} </noinclude>
{{Box-header
| title= Science and technology News
| editpage= Template:Science and technology
| titleforeground= black
}}
[[File:Google 2015 logo.png|200px|center|link=https://www.sfgate.com/business/article/A-Dad-Took-Photos-of-His-Naked-Toddler-for-the-17387906.php]]
*[https://www.sfgate.com/business/article/A-Dad-Took-Photos-of-His-Naked-Toddler-for-the-17387906.php A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.]
::It was a Friday night in February 2021. His wife called their health care provider to schedule an emergency consultation for the next morning, by video because it was a Saturday and there was a pandemic going on. A nurse said to send photos so the doctor could review them in advance.<br> (Kashmir Hill, SFGATE, August 21, 2022) <!-- Added August 22, 2022 -->


[[File:Eagle-2b.jpg|center|250px|link=https://www.eff.org/deeplinks/2022/02/its-back-senators-want-earn-it-bill-scan-all-online-messages]]
*[https://www.eff.org/deeplinks/2022/02/its-back-senators-want-earn-it-bill-scan-all-online-messages It’s Back: Senators Want EARN IT Bill to Scan All Online Messages]
::People don’t want outsiders reading their private messages — not their physical mail, not their texts, not their DMs, nothing. It’s a clear and obvious point, but one place it doesn’t seem to have reached is the U.S. Senate.<br> (Joe Mullin, EFF, US, February 3, 2022) <!-- Added 2-4-22-->


* [[Meta-Analysis Prevalence Pedophilia Hebephilia by Filip Schuster]]
[[File:I-phone.jpg|200px|right|link=https://arstechnica.com/information-technology/2021/08/apple-plans-to-scan-us-iphones-for-child-abuse-imagery/]]
:A meta-analysis of all seven relevant phallometric studies reveals that 22% of normal men show greater or equal sexual arousal to child stimuli (individuals up to 13 years old) than to adult stimuli. Combined results of two of these studies reveal male prevalence rates of about 3% for pedophilia (mostly sexually aroused by prepubescents) and about 16% for hebephilia (mostly sexually aroused by pubescents). Details of these studies are described, and implications of the results for sexual science and society are discussed.
*[https://arstechnica.com/information-technology/2021/08/apple-plans-to-scan-us-iphones-for-child-abuse-imagery/ Apple plans to scan US iPhones for child abuse imagery]
<!-- Added 10-19-14 -->
::Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.<br> (Madhumita Murgia and Tim Bradshaw, Financial Times, August 5, 2021)<!-- Added 8-5-21 -->
 
::'''Update'''
:::[https://www.newsweek.com/apple-backtracks-child-safety-update-following-criticism-over-privacy-concerns-1625885 Apple Backtracks on Child Safety Update Following Criticism Over Privacy Concerns]
::::Apple said Friday it plans to delay its "Expanded Protections for Children" iOS update that was announced last month after the company encountered negative feedback over privacy concerns.<br> (Abbianca Makoni, Newsweek, US, Sep. 3, 2021)<!-- Added 9-3-21 -->
:::::[https://9to5mac.com/2021/10/15/governments-planned-to-misuse-csam-scanning-tech/ Governments planned to misuse CSAM scanning tech even before Apple’s announcement]
::::::Governments were already discussing how to misuse CSAM scanning technology even before Apple announced its plans, say security researchers. The biggest concern raised when Apple said it would scan iPhones for child sexual abuse materials (CSAM) is that there would be scope-creep, with governments insisting the company scan for other types of images, and there now seems good evidence for this … <br> (Ben Lovejoy, 9to5Mac, US, October 15, 2021)<!-- Added 10-23-21 -->


{{box-footer}}
<noinclude>
[[Category:Portal Boylove News page templates]]
</noinclude>

Latest revision as of 21:50, 22 August 2022
