Stories from Summer Institute 2013

In French: “Mes impressions de l’Institut” (“My Impressions of the Institute”), by Sharon Hackett of the Centre de documentation sur l’éducation des adultes et la condition féminine (CDÉACF)

 

The International Policy Context of PIAAC

At the start of the first day, William Thorn, Senior Analyst at the OECD Education Directorate, placed PIAAC in the context of the OECD’s 2012 Skills Strategy, which aims to support governments in developing policies that help maintain and improve the supply of skills in the labour market and make better use of the skills workers already have. The OECD has long had an interest in better measures of “human capital” and has attempted to broaden the range of skills measured in each international survey – not always successfully. For example, it had wanted to include the personal attributes of “grit” (perseverance) and “locus of control” (feeling in control of one’s destiny) in PIAAC, but neither worked well in the field trial, so both were dropped. Many Institute participants expressed regret at this and hoped that the OECD would find a way to integrate these attributes into a later survey.

More successfully, PIAAC added “problem-solving in a technology-rich environment”, included the reading of digital text in its literacy assessment, and added self-reports of skill use at work and in daily life to the background questionnaire. There are, however, limits to how many things can be measured in a single survey: Mr. Thorn reported that they “are at about the limit” of the time respondents can be asked to spend answering, and noted that PIAAC is “ultimately a compromise between the desirable and the possible.”

 

“No Imperialistic Claim on Literacy”: OECD’s William Thorn responds to PIAAC Critics

In response to critics who allege that surveys such as PIAAC represent a “dominant” view of literacy skewed towards its economic uses, William Thorn acknowledged that, given its mandate, the OECD is focused largely on economic issues; however, the measure of literacy used in these surveys can be useful in other contexts. That said, Mr. Thorn continued, “We make no imperialistic claim that what we measure is literacy” to the exclusion of all other uses and visions of literacy.

 

The Purpose of Literacy Levels: Descriptive, not Normative

William Thorn of the OECD Education Directorate made it clear in a presentation and subsequent remarks that for him and his colleagues, the point of literacy levels is to describe what people at various points on the literacy continuum can do, not to make normative judgments about what skill levels people ought to have. Discussions about adult literacy in Canada have often included the claim that Level Three on the IALS/ALL/PIAAC scale is the “minimum level required by an individual to function in a modern society and economy”, but Thorn considers this claim “manifestly false”. For example, he pointed out that previous surveys found that over 60% of Italians scored below Level Three, “yet they’re managing…they’re an advanced country”.

Mr. Thorn made these remarks during a discussion of literacy levels, response probability values and cutoffs. While there are correlations between people’s literacy scores and a range of social and economic outcomes, he said, there is no definitive point at which people start being “incapable” of dealing with modern life.  In his synthesis of the day’s presentations and discussions, Stephen Reder, University Professor of Applied Linguistics at Portland State University, asked if “we’ve really thought through what levels are about” given that they tend to take on an unwarranted importance in policy.

On the second day of the Institute, adult educator Donald Lurette described how a Franco-Ontarian program, the Centre d’apprentissage et de perfectionnement (CAP), moved away from prescribing what “level” learners should aim for and toward helping them get the skills they need to reach objectives they have set for themselves, such as getting a job. Linda Jacobsen, a Senior Policy Analyst at the Public Health Agency of Canada, noted in her presentation that claims about Level Three don’t resonate in the health sector, mainly because that sector is concerned with making health information more accessible to everyone rather than with identifying people with “health literacy problems”. On the other hand, Michel Simard, Director of Continuing Education and Services to Business at Collège Lionel-Groulx, stated in a presentation that he considered that students need to be at Level Three to succeed in college, although this statement was not intended to apply to the general population. The key point is that the level of skills one needs depends on the context.

 

Dichotomies vs. Continuums

This was a theme throughout the Institute. In countries such as Canada, the United States, the United Kingdom and Australia, the OECD literacy assessment surveys have been a way of getting away from a dichotomous view of literacy, in which one is considered either literate or illiterate. However, Silvano Tocchi of Human Resources and Skills Development Canada acknowledged that this development may have been undercut in Canada by the importance given to Level Three. Meanwhile, a panel discussion about national and international literacy assessments revealed that France and Germany continue to use the “literate/functionally illiterate” paradigm in national surveys for largely pragmatic reasons: the purpose of these surveys is to identify the parts of the population with literacy needs. By contrast, Jennifer Coughran, Director of the Adult Literacy Policy Unit, Foundation Skills Branch of the Australian Government’s Department of Innovation, suggested that taking a “continuum” view of skills, as Australia has done, can still help meet literacy needs, since people anywhere on the continuum may need to upgrade their skills at certain points in their lives.

 

The French Approach to Basic Skills Evaluation

As Jean-Pierre Jeantheau of the French Agence Nationale de Lutte contre l’Illettrisme (ANLCI) explained during the panel discussion and in a presentation on the third day, the French approach to assessing literacy in national surveys uses a different methodology from that of international surveys such as PIAAC, and focuses more on people with serious basic skills problems. Those identified as having basic skills problems in the two IVQ surveys (2004 and 2012) represented 11-12% of the general adult population. Distinctions are made between people who were schooled in France but have serious difficulty with written material (“illettré”), people who have never learned a written code (“analphabète”), and those learning French as a foreign language. The surveys suggested that policy interventions had resulted in progress in improving the basic skills situation.

 

Response Probability Values

There has been some concern in the field about the effects that changing response probability (RP) values could have on the outcomes of international surveys. However, according to the OECD’s William Thorn, the change in RP from 0.8 in previous surveys to 0.67 in PIAAC will have no impact on scores or on the distribution of scores. This remained a question for many participants even at the end of the Institute and will likely require explanation when the results are released.
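
Since the question is likely to recur, a brief sketch of the underlying arithmetic may help. The following assumes a simple two-parameter logistic item response model; this is an illustration of how RP values work in general, not a description of PIAAC’s actual scaling methodology. Under that model, the probability that a respondent with proficiency \(\theta\) succeeds on an item with difficulty \(b\) and discrimination \(a\) is

\[ P(\theta) = \frac{1}{1 + e^{-a(\theta - b)}} \]

An item’s RP location is the proficiency at which this probability equals the chosen RP value; solving \(P(\theta) = RP\) for \(\theta\) gives

\[ \theta_{RP} = b + \frac{1}{a}\ln\left(\frac{RP}{1 - RP}\right) \]

Because \(\ln(0.8/0.2) \approx 1.39\) while \(\ln(0.67/0.33) \approx 0.71\), lowering the RP value from 0.8 to 0.67 moves each item’s reported location down the scale, changing how items are mapped to levels for descriptive purposes. Respondents’ proficiency estimates, however, are computed without reference to any RP value, which is consistent with Mr. Thorn’s statement that the change has no impact on scores or their distribution.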

 

Real-Life Experience, Culture and International Assessments

Complex methodological issues arise when surveys attempt to measure the presence of a “latent” trait like literacy by presenting people with texts and then posing questions intended to test their comprehension of those texts. The issue is made more complex when, in international surveys, test items are devised in one country and “transported” to others. Anthropologist Bryan Maddox, a consultant to the UNESCO Literacy Assessment and Monitoring Programme (LAMP), presented a case study of one LAMP test item: an ethnographic observation of nomadic herders in the Mongolian Gobi answering a question about camels. Since camels were something the herders knew from daily life, one might have expected them to generally get the item correct. In fact, many answered incorrectly. Why? Did their real-life familiarity with the subject distract them from the task of understanding the text and providing the answer the test required? Maddox’s presentation was a reminder that while surveys like LAMP and PIAAC aim to simulate “real-life” situations, they are not actually tests of “real-life” knowledge.

 

Language Minorities

Jean-Pierre Jeantheau of ANLCI noted in a presentation on Day 2 that Canada is almost unique in allowing participants to choose either of the two official languages (English or French) for the IALS or PIAAC surveys. Even in a multilingual country such as Switzerland (which participated in IALSS but decided not to be part of PIAAC), participants had to use the dominant language of the canton where they lived at the time.

That said, 66% of self-identified francophones who lived outside Quebec decided to do the 2003 IALSS tests in English. The data show that while these participants identify French as their mother tongue, many of them actually use English at home. A participant from Saskatchewan noted that in her province, among others, formal French education was actually forbidden for many years, which could explain why many people who identify as French-speakers would be reluctant to take a written test in French.  This issue also shows that language identity and language use are not one and the same: someone might identify with a particular minority language yet end up mostly using the dominant language for practical reasons. Jeantheau expressed the hope that future surveys will address this question in their background questionnaires.

 

What about People over Age 65?

National and international skills surveys typically exclude adults aged 65 and older despite the fact that an increasingly large proportion of the adult population in OECD countries falls into that category. This seems to be because it is assumed that most people in that age group are retired or on the verge of retirement, and that upgrading their skills is not a worthwhile investment. A number of participants questioned this attitude.

 

An Update on PIAAC in Canada: “This is Canada’s Data”

In a panel on PIAAC in Canada, experts from Human Resources and Skills Development Canada and the Council of Ministers of Education, Canada (CMEC) noted that Canada took a very large sample of more than 27,000 respondents in order to be able to focus on particular groups within the Canadian population. For example, the sample of off-reserve Aboriginal Canadians was 5,000, which is equivalent to the full national samples of many of the countries participating in PIAAC.

PIAAC in Canada Thematic Reports series: Six reports will be produced in 2014 and 2015 based on Canadian PIAAC data released in October 2013. The reports will be on:

  • Official language minorities
  • Aboriginal populations
  • Immigrants
  • Health and social outcomes
  • Education
  • Labour market

There will also be smaller reports in 2014 on competencies in Canada’s three northern territories and on the relationship between skills and earnings.

One panelist called on Canada’s literacy community to identify priorities for future research based on the data, saying, “This is Canada’s data”. Another participant, from HRSDC, reiterated this point at the end of the Institute, stating that the government’s resources are limited and it cannot be expected to do all the research that people would like done.

 

Health and Social Outcomes

One of the reports to be released as part of the PIAAC in Canada Thematic Reports series will address the health and social dimensions of skills (due in June 2015); it will be prepared by a team including representatives from the Public Health Agency of Canada (PHAC), HRSDC, CMEC, and the provinces of Alberta and Ontario.

Linda Jacobsen of PHAC presented at the Institute on how past surveys have influenced the health sector in Canada and on what the agency anticipates coming out of the PIAAC survey. The 2003 IALSS, she said, led to a “ramping up” of health literacy activity in Canada because it showed that a large proportion of the population could have significant problems dealing with health information. Fifty-five percent of the IALSS literacy tasks measured health-related activities, and data from these tasks were used for two Canadian Council on Learning (CCL) reports, Health Literacy in Canada: Initial Results from the International Adult Literacy and Skills Survey (2007) and Health Literacy in Canada: A Healthy Understanding (2008).

PIAAC does not contain as many health-related tasks, although the background questionnaire does include questions on health and will offer an opportunity to look at relationships between skills and health outcomes, particularly for certain “at-risk” groups.

Dr. Greg Brooks commented shortly after this presentation that he felt his native country, the UK, should pay more attention to the health and social dimensions of essential skills and that we should be broadening the scope of this research to other areas, particularly civic engagement.

 

Measuring the Impacts of Programs

One concern raised by participants even before the Institute was this: if literacy and essential skills (LES) proficiency as measured by PIAAC shows little or no change from previous surveys, will governments, other funders and the media take this as a reflection on the effectiveness of programs? How should the literacy field answer when people ask, “What did we get for all the money we spent on these programs?”

Professor Greg Brooks noted on the second day that the field in the UK has already been dealing with that issue: the Skills for Life strategy involved an investment of £4.5 billion (roughly $7.2 billion CDN) over a decade, yet a comparison of the 2003 and 2011 surveys found no significant change in proficiency in the general population. Prof. Brooks pointed out that despite the size of the investment, only a small percentage of the population took part in programs, so it would be unrealistic to expect much of an effect on the overall numbers.

 

Program Impacts: Programs as Parking Lots or Busy Intersections  

In addition, as American researcher Stephen Reder explained in a joint presentation with Franco-Ontarian literacy practitioner Donald Lurette, the Longitudinal Study of Adult Learning (LSAL) suggests that programs don’t have a significant short-term impact on proficiency; furthermore, the amount of time spent in a program makes little difference to outcomes. Participation in programs does, however, have an impact on literacy practices, and it seems that in the long term this improves proficiency as well as a host of socio-economic outcomes. These findings suggest a need to rethink the traditional “parking lot” model of programs, in which what matters is how long students are “parked” in the program, and to replace it with a “busy intersection” model, in which students come to programs from different directions and leave towards different destinations, and the program’s role is to provide the resources and support they need to embark on a persistent lifelong learning journey and reach those destinations (see our Ecologies of Learning: Culture, Context and Outcomes of Workplace LES research brief).

In the same presentation, Donald Lurette showed how his experiences as a practitioner with the Centre d'apprentissage et de perfectionnement (CAP) in Hawkesbury, ON, have led him to similar conclusions. Working with the Réseau pour le développement de l'alphabétisme et des compétences (RESDAC), he has developed a pan-Canadian model for “integrated” literacy development in Francophone communities, which RESDAC has adopted (see their publication Towards an Integrated Model to Support the Literacy Development of Francophones).

 

Program Impacts: Demonstration Projects Show Results in Workplace Literacy Programs

According to David Gyarmati and Boris Palameta, two researchers from SRDC, some programs do show quick results. They reported on three recent Canadian demonstration projects: Measures of Success, managed by The Centre for Literacy; Upskill, a large-scale randomized controlled trial of a workplace intervention in the accommodations industry, whose survey design was informed by IALS; and the BC Workplace Training Project. The researchers found significant short-term results (six months post-program) overall, although some programs produced better results than others, and “at risk” sub-groups showed as much improvement as other groups of learners. Important factors for program success included learner motivation, the ability to apply what is learned on the job, firm size (larger firms had better results) and the alignment of training with business needs.

 

Skills for the Workforce and Workplace

Michel Simard, Director of Continuing Education and Services to Business at Collège Lionel-Groulx, described the role the college network can play in raising levels of basic skills and in increasing workers’ ability to perform diverse tasks, adapt to rapid change and engage in further skill development. His college participated in a 2011-12 cross-Canadian study of community colleges which found that a large proportion of students entering technical studies programs after spending some time outside the education system had significant skills deficits. Mr. Simard described how local businesses in St-Jérôme, Quebec, that hire graduates from his college fed his thinking and shaped his decisions as he designed an intervention program to boost student skills.

Phillip Mondor of the Canadian Tourism Human Resource Council presented an employer perspective. Many people get their start in paid employment in this sector. He reported that these workers are less educated than the Canadian average and that the shortage of skilled workers is impeding productivity. Employers understand the need for training, said Mondor, but the sector consists mainly of small and medium enterprises, which tend to find existing programs inflexible, incompatible with business needs and lacking in quality control. Some firms are trying to bypass the issue by investing in technology rather than people, although this raises its own “literacy” issues around employees’ ability to use the technology.

In his presentation, Paul Bélanger, a researcher at the Université du Québec à Montréal, noted that while international surveys like PIAAC carry a lot of weight with policymakers because they provide a macro picture of skills in the general population, it is important to balance that research with more localized needs-assessments for workplaces. He also stated that the literacy “problem” we face today is not that people are becoming less skilled, but that skill demands are increasing and the skills people already have atrophy through non-use.

 

Themes raised in comments at the end of the Institute:

  • The possibilities for use of the rich data coming from PIAAC: “a very exciting time to be a researcher in this field” and the need to link practitioners with researchers so that research reflects the needs of the field
  • The need to complement PIAAC data with other data sets
  • Many people in the sector have struggled with an overly strict definition of literacy based on the one in the OECD surveys, as provinces have aligned their policies with IALS; PIAAC is a powerful tool, but not the only tool, and it is not the “gold standard” of literacy
  • Let us remember what these surveys are for: they are not ends in themselves

 

by Paul Beaulieu, The Centre for Literacy

 
