A Chinese check on Algorithms

With algorithmic recommendations becoming ever more pervasive and each company using its own secret sauce to entice consumers, the Chinese government has turned to regulation

November 15, 2021 01:24 pm | Updated 01:28 pm IST

China's draft rules allow the country's authorities to inspect algorithms and request rectifications should they find problems. Representational image

Algorithmic recommendations are at the heart of the digital transformation we are witnessing now. From buying laptops to groceries, most people have gradually adopted a new digital consumer lifestyle within a decade. The pandemic-induced policies have accelerated this shift and added more services that can be accessed online.

According to a report from the UN Conference on Trade and Development (UNCTAD), the e-commerce sector saw a “dramatic” rise in its share of all retail sales, from 16% to 19% in 2020. Online retail sales grew markedly in several countries, and global e-commerce sales jumped to $26.7 trillion in 2019, up 4% from 2018. That’s about 30% of global gross domestic product (GDP).

A large proportion of these online purchases are made on smartphones, and as we add items to our digital shopping cart, the app nudges us to consider alternative brands. It may also offer a related product, suggesting that other buyers bought the two items together. Unlike in a physical store, the online environment has no salesperson to assist the buyer. Instead, a digital store’s algorithm recommends ideas to the customer.

Recommender systems

These algorithmic systems are not confined to online shopping. Social media firms, transport service providers and food-delivery portals use their proprietary algorithms to engage users. Their algorithms are built using ‘collaborative filtering’, ‘content-based filtering’ or a hybrid of the two.

In a ‘collaborative filtering’ approach, an online site uses product rating data gathered from various users to recommend items to a specific user. This approach uses both explicit data, such as star ratings, and implicit data, such as clicks and purchase history.
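A minimal sketch of user-based collaborative filtering can make the idea concrete. The users, items and ratings below are entirely hypothetical; real systems operate on millions of users and use far more sophisticated models, but the core step is the same: find users with similar rating patterns and suggest items they liked.

```python
# User-based collaborative filtering sketch (all data is illustrative).
import math

# Explicit ratings: user -> {item: rating on a 1-5 scale}.
ratings = {
    "alice": {"laptop": 5, "mouse": 4, "keyboard": 5},
    "bob":   {"laptop": 5, "mouse": 5, "headset": 4},
    "carol": {"novel": 5, "cookbook": 4},
}

def cosine(u, v):
    """Cosine similarity over the items two users have both rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    norm_u = math.sqrt(sum(r * r for r in u.values()))
    norm_v = math.sqrt(sum(r * r for r in v.values()))
    return dot / (norm_u * norm_v)

def recommend(user, k=2):
    """Score items the user hasn't rated, weighted by rater similarity."""
    scores = {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], theirs)
        for item, r in theirs.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Here `recommend("alice")` ranks Bob's headset first, because Alice and Bob rated the same electronics similarly, while Carol's books contribute nothing (no items in common means zero similarity).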

Content-based filtering is impersonal and makes use of keywords to run its system. A classic example of this approach is Google search, which spews out a list of websites every time a user keys in words and phrases. The service doesn’t require personal data to give its output.

Most platforms combine collaborative and content-based filtering systems to make recommendations. Netflix is a good example of a hybrid recommender algorithm. The platform suggests films and series by comparing the watching and searching habits of similar users and combining it with content that a particular user has rated highly.
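A toy blend in the spirit of that Netflix example combines a collaborative signal (how many similar users watched a show) with a content signal (overlap with genres this user has rated highly). All titles, genres, counts and weights below are made up; production systems learn these weights rather than fixing them by hand.

```python
# Hybrid recommender sketch: weighted blend of two normalised signals.
# All data below is hypothetical.

user_genres = {"thriller", "drama"}  # genres this user has rated highly

# Collaborative signal: watches among users with similar habits.
peer_counts = {"Show A": 3, "Show B": 1, "Show C": 2}

# Content signal: genre tags per show.
show_genres = {
    "Show A": {"comedy"},
    "Show B": {"thriller", "drama"},
    "Show C": {"thriller"},
}

def hybrid_score(show, w_collab=0.5, w_content=0.5):
    """Blend peer popularity with genre overlap, each scaled to [0, 1]."""
    collab = peer_counts[show] / max(peer_counts.values())
    content = len(show_genres[show] & user_genres) / len(user_genres)
    return w_collab * collab + w_content * content

ranked = sorted(show_genres, key=hybrid_score, reverse=True)
```

The blend matters: Show A is the peer favourite and Show B matches the user's taste best, and the hybrid score lets a strong content match outrank raw popularity.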

Some other platforms use session-based algorithms to make recommendations in real time. For instance, if a user purchased a smartphone last time and now starts a new session to buy a book, data on the previous purchase is irrelevant to recommending a book. So this method relies on the sequence of interactions within the current session to suggest buying ideas.
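The smartphone-versus-book example above can be sketched with a simple co-occurrence table: recommendations are drawn only from the items clicked in the current session, so the old smartphone purchase never enters the calculation. The items and co-view table are hypothetical; real session-based recommenders typically use learned sequence models rather than a static table.

```python
# Session-based recommendation sketch (illustrative data).
from collections import Counter

# Items other sessions viewed together with each item.
co_views = {
    "novel": ["bookmark", "reading lamp", "cookbook"],
    "cookbook": ["apron", "novel"],
    "smartphone": ["phone case", "charger"],
}

def session_recommend(session, k=2):
    """Suggest items co-viewed with the current session's clicks only."""
    counts = Counter()
    for item in session:               # only this session's interactions
        counts.update(co_views.get(item, []))
    for item in session:               # don't re-suggest what's in session
        counts.pop(item, None)
    return [item for item, _ in counts.most_common(k)]
```

A session of `["novel", "cookbook"]` yields book-related suggestions; phone accessories never appear because the earlier smartphone purchase is outside the session.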

Problems in the system

Within a decade, these algorithms, now embedded in most applications, have metamorphosed into intelligent recommenders telling us what’s trending, what to buy, where to go on holiday, and which shows to watch. While algorithmic decision-making is useful in some cases, it becomes problematic when applied in social settings. In 2017, some teachers in the U.S. state of Texas sued a multinational firm, SAS Institute, after the company’s proprietary algorithm was used by schools in Houston to judge teachers’ skills based on their students’ standardised test scores. The teachers complained that the company’s Educational Value-Added Assessment System, or EVAAS, ranked them low without giving a good reason. And they had no way to check whether the algorithm was fair, as SAS Institute did not disclose how its rating system worked.

A U.S. federal judge ruled that the use of EVAAS may violate teachers’ civil rights, and the school district paid the teachers’ fees and agreed to stop using the software.

China fills the gap

In December 2020, the European Commission proposed draft rules on large tech firms that host content and make decisions about billions of users’ posts, comments, messages, photos, and videos. While the EU’s General Data Protection Regulation (GDPR) remains the gold standard for data protection rules, there is a void in the regulation of algorithms. China is looking to fill that void with a ground-breaking model to regulate algorithmic recommendations.

In August, the Cyberspace Administration of China (CAC) issued a statement that companies should not set up algorithm models that entice users to spend large amounts of money in a way that may disrupt public order. The country’s Draft Internet Information Service Algorithmic Recommendation Management Provisions provides a framework for the regulation of algorithmic recommendation.

“This policy marks the moment that China's tech regulation is not simply keeping pace with data regulations in the EU, but has gone beyond them,” said Kendra Schaefer, a partner at Beijing-based consultancy Trivium China.

The draft rules allow Chinese authorities to inspect algorithms and request rectifications should they find problems, the CAC added.
