e dot dot dot
a mostly about the Internet blog

Confused Critic Of Section 230 Now In Charge Of NTIA

Furnished content.


Multiple experts on Section 230 have pointed out that the NTIA's bizarre petition to the FCC to reinterpret Section 230 of the Communications Decency Act is complete nonsense. Professor Eric Goldman's analysis is quite thorough in ripping the petition to shreds.

Normally we expect a government agency like NTIA to provide an intellectually honest assessment of the pros/cons of its actions and not engage in brazen partisan advocacy. Not any more. This petition reads like an appellate brief that would get a C- in a 1L legal writing course. It demonstrated a poor understanding of the facts, the law, and the policy considerations; and it ignored obvious counterarguments. The petition is not designed to advance the interests of America; it is designed to burn it all down.
As we mentioned, it seemed likely that the petition was written by Adam Candeub, a lawyer hired by the NTIA only a few months ago. Readers may recognize Candeub's name because he represented the white nationalist Jared Taylor in a failed lawsuit against Twitter for kicking him off the platform. At the time of that lawsuit, I engaged in an email discussion with Candeub in which he tried to justify it, and his arguments included the same sort of nonsense and debunked legal theories we now see in the NTIA petition. In that exchange, he told me that "Section 230 doesn't help Twitter" because "if an internet firm starts to edit or curate others' comments -- creating its own content, it loses this immunity." That, of course, is incorrect. Indeed, the California courts agreed with me (and basically every other court) in ruling that Section 230 protected Twitter's decision to remove Candeub's client.

In the couple of years since all of that went down, Candeub has continued his quixotic quest to reimagine Section 230 to say what he wants it to say, rather than what the plain language of the law, basically every court on record, and the authors of the law have said it actually says. And now he'll get to do that as the guy in charge of NTIA. Axios is reporting that Candeub has been promoted to acting head of the agency. Given Candeub's activism on this issue, it's an odd role for him and, as has happened so often in this particular administration, it represents a destruction of historical norms. NTIA has historically been extremely balanced and has avoided positions of direct political advocacy. It certainly appears that it will take a different approach under Candeub, one that includes blatantly misrepresenting key laws about the internet.

Historically, NTIA has been an important agency in protecting the open internet -- but it should now be seen as hostile to that open internet. And that's disappointing for its legacy.

Read more here

posted at: 12:00am on 18-Aug-2020
path: /Policy | permalink




England's Exam Fiasco Shows How Not To Apply Algorithms To Complex Problems With Massive Social Impact

Furnished content.


The disruption caused by COVID-19 has touched most aspects of daily life. Education is obviously no exception, as the heated debates about whether students should return to school demonstrate. But another tricky issue is how school exams should be conducted. Back in May, Techdirt wrote about one approach, online testing, which brings its own challenges. Where online testing is not an option, other ways of evaluating students at key points in their educational career need to be found. In the UK, the key test is the GCE Advanced level, or A-level for short, taken in the year when students turn 18. Its grades are crucially important because they form the basis on which most university places are awarded in the UK.

Since it was not possible to hold the exams as usual, and online testing was not an option either, Ofqual, the body responsible for regulating exams in England, turned to technology. It came up with an algorithm that could be used to predict a student's grades. The results of this high-tech approach have just been announced in England (other parts of the UK run their exams independently). It has not gone well. Large numbers of students have had their expected grades, as predicted by their teachers, downgraded, sometimes substantially. An analysis from one of the main UK educational associations has found that the downgrading is systematic: "the grades awarded to students this year were lower in all 41 subjects than they were for the average of the previous three years."

Even worse, the downgrading turns out to have hit students in poorly performing schools, typically in socially deprived areas, hardest, while schools that have historically done well, often in affluent areas or privately funded, saw their students' grades improve over teachers' predictions. In other words, the algorithm perpetuates inequality, making it harder for brilliant students in poor schools or from deprived backgrounds to go to top universities. A detailed mathematical analysis by Tom SF Haines explains how this fiasco came about:

Let's start with the model used by Ofqual to predict grades (p85 onwards of their 319 page report). Each school submits a list of their students from worst student to best student (it included teacher suggested grades, but they threw those away for larger cohorts). Ofqual then takes the distribution of grades from the previous year, applies a little magic to update them for 2020, and just assigns the students to the grades in rank order. If Ofqual predicts that 40% of the school is getting an A [the top grade] then that's exactly what happens, irrespective of what the teachers thought they were going to get. If Ofqual predicts that 3 students are going to get a U [the bottom grade] then you better hope you're not one of the three lowest rated students.
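To make that mechanism concrete, here is a minimal sketch of rank-order grade assignment in Python. It is an illustration only, not Ofqual's actual code: the function name, the simplified grade scale, the toy distribution and the example cohort are all assumptions made for the sake of the example.

# Illustrative sketch of rank-order grade assignment -- NOT Ofqual's real model.
# Assumptions: a simplified A-level grade scale (U to A), a predicted grade
# distribution for the school (e.g. derived from its historical results), and
# a ranking of this year's students from worst to best, as Haines describes.

def assign_grades(students_worst_to_best, predicted_distribution):
    """Assign grades purely by rank order.

    students_worst_to_best: list of student names, worst first, best last.
    predicted_distribution: dict mapping grade -> fraction of the cohort
        expected to get that grade (fractions sum to 1.0), e.g. {"A": 0.4}.
    """
    n = len(students_worst_to_best)
    results = {}
    index = 0
    # Walk up the grade scale from the bottom, handing out each grade's quota.
    for grade in ["U", "E", "D", "C", "B", "A"]:
        quota = round(predicted_distribution.get(grade, 0.0) * n)
        for student in students_worst_to_best[index:index + quota]:
            results[student] = grade
        index += quota
    # Any students left over because of rounding get the top grade.
    for student in students_worst_to_best[index:]:
        results[student] = "A"
    return results

# Hypothetical cohort: the teacher may believe everyone deserves at least a C,
# but if the historical distribution says someone gets a U, someone gets a U.
cohort = ["Pat", "Sam", "Alex", "Jo", "Chris"]   # ranked worst to best
history = {"U": 0.2, "C": 0.4, "A": 0.4}         # toy historical distribution
print(assign_grades(cohort, history))
# {'Pat': 'U', 'Sam': 'C', 'Alex': 'C', 'Jo': 'A', 'Chris': 'A'}

Note that the teachers' assessments never enter the calculation at all: only each student's rank within the cohort and the school's historical distribution matter.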
As this makes clear, the inflexibility of the approach guarantees many cases of injustice, in which bright, hard-working students are given poor grades simply because they were lower down in the class ranking, or because their school did badly the previous year. Twitter and UK newspapers are currently full of stories of young people whose hopes have been dashed by this effect: they have lost the university places they had been offered because of these poorer-than-expected grades. The problem is so serious, and the anger expressed by parents of all political affiliations so palpable, that the UK government has been forced to scrap Ofqual's algorithmic approach completely and will now use the teachers' predicted grades in England. Exactly the same thing happened in Scotland, which also applied a flawed algorithm and caused similarly huge anguish to thousands of students before dropping the idea.

The idea of writing algorithms to solve this complex problem is not necessarily wrong. Other solutions -- like using grades predicted by teachers -- have their own issues, including bias and grade inflation. The problems in England arose because people did not think through the real-life consequences of the algorithm's abstract rules for individual students, even though they were warned of the model's flaws. Haines offers some useful, practical advice on how it should have been done:
The problem is with management: they should have asked for help. Faced with a problem this complex and this important they needed to bring in external checkers. They needed to publish the approach months ago, so it could be widely read and mistakes found. While the fact they published the algorithm at all is to be commended (if possibly a legal requirement due to the GDPR right to an explanation), they didn't go anywhere near far enough. Publishing their implementations of the models used would have allowed even greater scrutiny, including bug hunting.
As Haines points out, last year the UK's Alan Turing Institute published an excellent guide to implementing and using AI ethically and safely (pdf). At its heart lie the FAST Track Principles: fairness, accountability, sustainability and transparency. The fact that Ofqual evidently didn't think to apply them to its exam algorithm means it only gets a U grade for its work on this problem. Must try harder.

Follow me @glynmoody on Twitter, Diaspora, or Mastodon.

Read more here

posted at: 12:00am on 18-Aug-2020
path: /Policy | permalink






