e dot dot dot
a mostly about the Internet blog by

Rep. Mark Takano Introduces Bill That Would Keep Companies From Blocking Defendants' Access To Evidence

Furnished content.


When the government doesn't want to talk about its law enforcement tech, it dismisses cases. The FBI has done this on several occasions. First, it told local law enforcement to dismiss cases rather than discuss Stingray use in court. Then it did the same thing with its homegrown malware in child porn cases.

But the government can't do everything itself. It purchases software and outsources forensic investigation. All well and good except when it comes to prosecutions. Defendants have a right to access the evidence being used against them. But in court cases where third-party tech is in play, private companies are inserting themselves into the proceedings to demand the courts protect their "trade secrets."

Obviously, this makes a mockery of the adversarial system. If defendants can't challenge the evidence being used against them, the government will be encouraged to stack the deck in its favor by offshoring as much of its forensic and investigative work as possible.

Fortunately, someone is actually trying to do something about this. Rep. Mark Takano (California) is introducing a bill that would prevent tech companies from helping the federal government screw criminal defendants out of their Constitutional rights.

Takano's Justice in Forensic Algorithms Act of 2019 was introduced with this rather clever tweet, featuring a bit of pseudo-coding to drive the point home.
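The embedded tweet itself isn't preserved in this copy. As a purely hypothetical sketch of the kind of pseudo-code the post describes (not Takano's actual wording), the gist might look something like this:

```python
# Purely hypothetical reconstruction -- the tweet isn't reproduced here,
# so this is only the gist of the bill's core principle, not its text.
def evidence_admissible(from_proprietary_algorithm: bool,
                        defense_can_inspect_source: bool) -> bool:
    """Toy model: secret software shouldn't survive a due-process challenge."""
    if from_proprietary_algorithm and not defense_can_inspect_source:
        # Trade secrecy shouldn't trump the right to confront the evidence.
        return False
    return True

# Evidence from a black box the defense can't examine fails the test.
assert evidence_admissible(True, False) is False
```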

If the government is using third-party tech to prosecute citizens, citizens shouldn't be denied access to information just because some company thinks any examination at all might undercut its market advantage.
“The trade secrets privileges of software developers should never trump the due process rights of defendants in the criminal justice system,” said Rep. Mark Takano. “Our criminal justice system is an adversarial system. As part of this adversarial system, defendants are entitled to confront and challenge any evidence used against them. As technological innovations enter our criminal justice system, we need to ensure that they don’t undermine these critical rights. Forensic algorithms are black boxes, and we need to be able to look inside to understand how the software works and to give defendants the ability to challenge them. My legislation will open the black box of forensic algorithms and establish standards that will safeguard our Constitutional right to a fair trial.”
Congress can't force the court to side with defendants in cases where access to third-party software is at stake. But it can prevent companies from invoking trade secret privileges to prevent defendants from accessing evidence. The bill goes further than just blocking trade privilege interjections. It also would create a national standard for forensic algorithms to ensure they are robust and fair. And that they actually do what they say they do.

This process could bring a bit more science to a field that's been mostly mumbo and/or jumbo. And it won't allow law enforcement to create their own forensic black boxes to replace the ones they used to purchase from third parties. It will require input from a number of parties not in the law enforcement profession, ensuring this won't end up being another half-assed effort that shores up the government's belief that all accused parties are guilty until proven guilty.
Directs NIST to establish Computational Forensic Algorithms Standards and a Computational Forensic Algorithms Testing Program and requires federal law enforcement to comply with these standards and testing requirements in their use of forensic algorithms. In developing standards NIST is directed to:

- collaborate with outside experts in forensic science, bioethics, algorithmic discrimination, data privacy, racial justice, criminal justice reform, exonerations, and other relevant areas of expertise identified through public input;
- address the potential for disparate impact across protected classes in standards and testing; and
- gather public input for the development of the standards and testing program and publicly document the resulting standards and testing of software.
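The bill doesn't prescribe how "disparate impact across protected classes" should be measured; that would be up to NIST's standards process. As a rough, hypothetical illustration of what such a test could look like in a testing program, here is a minimal sketch that compares false-positive rates across groups on a labeled validation set (the function names, the 1.25 ratio, and the data format are all invented for this example):

```python
from collections import defaultdict

def false_positive_rates(results):
    """results: iterable of (group, predicted_match, actual_match) tuples
    from running the forensic software on a labeled validation set.
    Returns each group's false-positive rate among true non-matches."""
    false_pos = defaultdict(int)
    non_matches = defaultdict(int)
    for group, predicted, actual in results:
        if not actual:
            non_matches[group] += 1
            if predicted:
                false_pos[group] += 1
    return {g: false_pos[g] / n for g, n in non_matches.items() if n}

def flags_disparate_impact(rates, max_ratio=1.25):
    """Flag if any group's false-positive rate exceeds the lowest group's
    by more than max_ratio (an arbitrary, illustrative threshold)."""
    if not rates:
        return False
    lowest = min(rates.values())
    return any(rate > lowest * max_ratio for rate in rates.values())
```

A real standard would obviously involve far more than one metric, but the point is that this kind of check only becomes possible once testers can actually run the software.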
This part could take a while to get up and running. But it's far better than the system currently being used, which has allowed the government's expert forensic witnesses to overstate the certainty of their findings for years on end.

The more immediate effect will be the constraints placed on private companies that wish to intercede in criminal cases. The government -- working with its vendors -- will be obligated to provide defendants with a report on the software used, an executable version of the software itself, and its source code. If companies are worried their trade secrets might be exposed in criminal cases, they might want to rethink their partnerships and decide whether the tradeoffs they have to make in court to continue doing business with the government are worth it.




posted at: 12:00am on 26-Sep-2019




Chinese Authorities Call For Internet Companies To Add Bias To AI Algorithms -- In Order To 'Promote Mainstream Values'

Furnished content.


Techdirt has been tracking the worsening online surveillance and censorship situation in China for many years now. The latest move concerns the currently hot area of artificial intelligence (AI). It's a sector that the Chinese government understands better than most Western governments, and which it has made one of its technology priorities. The authorities in China know that AI in the form of algorithms is increasingly deployed to optimize and customize Web sites. They have realized that this fact gives them an important new lever for controlling the online world. As South China Morning Post reports, the Cyberspace Administration of China has released its draft regulations on "managing the cyberspace ecosystem", which include the following:

The regulations state that information providers on all manner of platforms -- from news and social media sites, to gaming and e-commerce -- should strengthen the management of recommendation lists, trending topics, "hot search" lists and push notifications.

"Online information providers that use algorithms to push customised information [to users] should build recommendation systems that promote mainstream values, and establish mechanisms for manual intervention and override," it said.
"Mainstream values" include resources that promote Xi Jinping's writings; party policies and socialist core values; information that displays China's economic and social development; and anything else which helps promote Chinese culture and stability. By contrast, "harmful information" is stuff that is "sexually suggestive, promotes extravagant lifestyles, flaunts wealth or hypes celebrity gossip and scandals."As is increasingly the case, China is in the vanguard of digital culture here. The rest of the world is beginning to wake up to the serious threat of bias as AI-powered algorithms are deployed more widely. China has moved beyond that stage and is now actively weaponizing bias to push a government agenda. This is a useful warning to those who see algorithmic decision-making as the solution to hard problems.For example, it is clear that the only way that the EU Copyright Directive's upload filters can be implemented is through automated filters using AI. As China's latest move makes clear, once those filters are in place on major Internet sites in the EU, it would be easy for governments to require that the software should be tweaked to introduce a little bias -- to protect the children, or society, or whatever. Those who are horrified by what the Chinese authorities are proposing would do well to start arguing for safeguards to stop the same path being taken outside that country.Follow me @glynmoody on Twitter, Diaspora, or Mastodon.




posted at: 12:00am on 26-Sep-2019



