Ghana: Build Capacities to Avoid Threats of AI - Panellists

Panellists at a lesson-sharing workshop have called on the public to build their capacity to avoid being threatened by Artificial Intelligence (AI).

AI is the technology that enables machines to simulate human-like cognitive processes such as learning, reasoning, and decision-making, and it is rapidly transforming how things are done in fields ranging from healthcare, finance, education, and media to entertainment.

They confirmed the existence of gender bias in the use of AI, attributing it to how data is collected and how models are designed, and therefore called for regulations to be developed.

They made the call after research conducted on AI systems revealed that AI application models that use household survey data can inadvertently produce wealth estimates that are biased against women.

The study was conducted by AidData and the Ghana Centre for Democratic Development (CDD-Ghana).

The study utilised geospatial data and Ghana's Demographic and Health Surveys (DHS) data as its foundation to look into the nuances of gender bias in wealth estimates generated by AI.

The project aimed to evaluate the potential for gender bias in wealth estimates generated using data from the Ghana Demographic and Health Survey, along with the satellite imagery used to construct the model.
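One common way to probe for such bias, sketched below purely as an illustration and not as the study's actual method, is to compare a model's wealth-prediction errors for female-headed and male-headed households; a consistent gap between the two groups would point to systematic bias. The column names and figures here are hypothetical.

    # Minimal sketch of a gender-bias check on model wealth estimates.
    # Data and column names are illustrative, not from the AidData/CDD-Ghana study.
    import pandas as pd

    households = pd.DataFrame({
        "head_sex":     ["female", "male", "female", "male", "female", "male"],
        "dhs_wealth":   [2.1, 3.4, 1.8, 2.9, 2.5, 3.1],   # survey-based wealth index
        "model_wealth": [1.7, 3.5, 1.4, 3.0, 2.0, 3.2],   # AI-predicted wealth index
    })

    # Prediction error per household; a systematic gap between groups signals bias
    households["error"] = households["model_wealth"] - households["dhs_wealth"]
    print(households.groupby("head_sex")["error"].agg(["mean", "std"]))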

The Director of Research of CDD-Ghana, Dr Edem Selormey, in a presentation on "Evaluating Gender Bias in AI Applications Using Household Survey Data," said AI applications had cut across various domains, ushering in a new era of efficiency and innovative solutions to complex problems, "so understanding and addressing the implications of AI on society has become ever more imperative".

She said that as AI was being integrated into everyday activities, it was essential to critically examine its impact on different segments of society, particularly on gender disparities.

Dr Selormey said the project team analysed the intricate relationship between AI models, wealth estimates, and gender dynamics.

"They used Al's capacity to discern and classify to reveal biases that may lie hidden within the DHS data, As you will see from the findings when they are presented, while Al models demonstrate their strength and usefulness, any level of bias, no matter how small, can cast shadows on accuracy,"

She was optimistic that AI would be harnessed for the betterment of all, and become a driving force for empowerment rather than one that perpetuates disparities.

A Research Scientist at AidData, Rachel Sayers, said that with enough resources to measure outcomes, AI had increasingly been used to predict outcomes in other areas.

She called for such approaches to be implemented to evaluate the results of policies and programs in the country.
