
6.1 For Teachers: DigComp Integration into Module 1: Information and Data Literacy

Objectives of the topic
By the end of this module, adult educators will be able to:
Critically evaluate digital information sources for accuracy, credibility, and relevance.
Model effective search, filtering, and data management strategies for their learners.
Integrate information and data literacy competences into adult education practice.
Use AI and digital tools to support learners in navigating, analyzing, and organizing information.
Address ethical challenges around misinformation, bias, and AI reliability.
Differentiate instruction to meet diverse learner needs.
Align their teaching practice with DigCompEdu competence areas, especially:
Digital Resources: Selecting, evaluating, and adapting information sources.
Teaching and Learning: Embedding information literacy into activities.
Empowering Learners: Encouraging autonomy in information handling.
Facilitating Learners’ Digital Competence: Supporting learners to develop responsible, critical use of digital content.
Theoretical Description
Why Information and Data Literacy Matters in Adult Education
In today’s digital world, adult learners are constantly exposed to an overwhelming stream of information - from social media posts and online news articles to workplace platforms and e-government services. Without strong information and data literacy, they risk being misinformed, manipulated or unable to use digital resources effectively in both personal and professional contexts.
For trainers, teaching information and data literacy goes beyond showing how to ‘search on Google.’ It means guiding learners to:
Distinguish facts from misinformation and disinformation: Adult learners need critical strategies to verify the credibility of online sources, cross-check facts and recognize fake news, biased reporting or AI-generated misinformation.
Search effectively with purpose: Instead of ‘random browsing,’ learners must develop structured approaches to searching, using keywords, advanced filters and reliable databases. This skill is crucial for lifelong learning and professional development.
Organize and manage digital data responsibly: Information is only valuable if it can be stored, retrieved and used when needed. Trainers should help learners develop habits such as file naming conventions, using cloud storage, bookmarking and maintaining digital hygiene.
Understand bias, reliability and ethical implications of AI: With the growing role of AI-generated content, learners must critically evaluate how algorithms shape what they see online (filter bubbles, echo chambers) and reflect on issues of bias, fairness and accountability.
Recognize their digital footprint: Adult learners often underestimate how their online searches, clicks and posts leave a permanent trace. Trainers must build awareness of privacy, data tracking and long-term consequences of online behavior.
Role of Trainers
Personal Mastery: Trainers themselves must practice and model reliable search strategies, ethical data use, and digital organization skills.
Pedagogical Integration: Trainers should embed these skills into everyday learning tasks, so adult learners not only access information but also evaluate, contextualize and use it meaningfully.
By embedding information and data literacy into teaching practice, adult educators empower learners to become confident, critical, and autonomous participants in the digital society—capable of navigating the complexity of today’s information ecosystem.
Core Competences for Trainers
Finding and Evaluating Information: advanced search, credibility checks, bias awareness.
Organizing and Managing Data: structured storage, file naming, collaborative tools.
Critical Awareness: misinformation, disinformation, algorithmic bias, ethical sharing.
Using AI for Information Literacy: AI as a support tool (summarizing, comparing), combined with critical evaluation.
Differentiation: adapting fact-checking to low, medium and advanced digital literacy learners.
AI and Ethics Reminder
AI tools can be powerful assistants for summarization, translation or fact-checking — but they can also give inaccurate results, reflect bias or omit context. Trainers must teach learners to:
1) cross-check AI output with reliable sources,
2) question how AI tools generate answers,
3) reflect on the ethical use of AI-generated information.
Trainers should remind learners that AI is not a substitute for human judgement but a support tool that requires validation.
Practical Application for Trainers
Activity 1: Critical Source Comparison
· Collect 3 online articles on the same topic (e.g., ‘digital skills in the workplace’).
· Small groups compare authorship, publication source, evidence, bias.
· AI add-on: Ask AI to generate a quick summary of each article, then check where it may misrepresent facts.
· Reflection: ‘How would I scaffold this for my adult learners at different levels (low vs. advanced)?’
Activity 2: Digital Data Organization Workshop
· Create a shared folder structure for a sample course.
· Practice setting permissions (view, comment, edit).
· Explore version control and accountability.
· Real-life link: Show how the same method applies to managing job applications, personal records, or community projects.
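As an illustration only (the folder and file names below are hypothetical examples, not part of the module), the folder structure and naming convention practiced in Activity 2 could be sketched like this:

```shell
# Hypothetical sample layout for a shared course folder.
# Numbered prefixes keep the folders in a fixed order;
# ISO dates (YYYY-MM-DD) in file names make versions sort chronologically.
mkdir -p sample_course/01_syllabus
mkdir -p sample_course/02_materials
mkdir -p sample_course/03_assignments
mkdir -p sample_course/04_shared_notes
# Example naming convention: date_topic_version
touch sample_course/02_materials/2024-09-15_search-strategies_v1.pdf
ls sample_course
```

The same pattern (predictable folder names plus dated, versioned file names) transfers directly to managing job applications, personal records, or community projects.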
Activity 3: Fact-Checking with AI and Human Sources
· Enter a controversial statement into ChatGPT/AI fact-check tools.
· Verify with official and reliable portals (WHO, EU FactCheck, Snopes).
· Ethics prompt: Discuss what risks occur if learners only trust AI without validation.
· Reflection: ‘How can I balance the speed of AI with human accuracy in my teaching practice?’
Optional Extension (Advanced learners): Data Bias Exploration
· Present an example of algorithmic bias (e.g., search results, AI translation issues).
· Ask groups to identify how bias can mislead decisions.
· Discussion: ‘How can adult learners recognize and counteract bias in daily digital life?’
Differentiation Strategies
Learner Level | Activity Adaptation | Example Tool | Trainer’s Role / Support
Beginner | Use a simple checklist with icons (✔ credible / ❌ not credible) | Printed worksheet | Provide strong guidance, model each step, explain in plain language
Intermediate | Guided comparison of 2 news items | Google Fact Check | Facilitate group work, give guiding questions, moderate discussion
Advanced | Debate on algorithmic bias in search results | Google Scholar, AI fact-checker | Act as a facilitator, encourage independent research and critical argumentation
Note for Trainers: These strategies are progressive - learners can move from beginner to intermediate and advanced activities over time. Trainers should encourage progression without overwhelming learners.
Real-Life Scenarios
Real-Life Scenario 1: Health Misinformation Check
The Problem: A learner insists on a health claim they saw on Facebook (e.g., ‘drinking hot water prevents flu’). Other learners are unsure.
The Application (Trainer’s Role):
Trainer models how to search reliable health databases.
Trainer guides learners to use keywords strategically and cross-check multiple sources.
Trainer prompts reflection: ‘How do we decide which information to trust?’
The Result: Learners see how to apply structured searching and credibility checks.
Trainer connects this to DigCompEdu:
2.1 Selecting Digital Resources and 6.1 Information and Media Literacy.
Real-Life Scenario 2: Verifying Financial Advice
The Problem: Learners receive investment tips via WhatsApp/Telegram groups and want to know if they are reliable.
The Application (Trainer’s Role):
Trainer shows how to compare official vs. unofficial sources (government financial portals vs. social media).
Learners practice using fact-check platforms and evaluate credibility.
AI is used to summarize advice but then cross-checked with official sites.
The Result: Learners gain confidence in consulting official financial sources.
Trainer connects this to DigCompEdu:
3.2 Guidance, 5.3 Actively Engaging Learners, and 6.1 Information and Media Literacy.
Real-Life Scenario 3: Organizing Course Data Safely
The Problem: During a blended course, learners struggle with messy file sharing (different versions, lost documents).
The Application (Trainer’s Role):
Trainer guides learners in creating a shared folder structure (Google Drive/OneDrive).
Permissions (view, comment, edit) are practiced explicitly.
Trainer models naming conventions + version control.
The Result: Learners see how structured data management prevents confusion and builds accountability.
Trainer connects this to DigCompEdu:
2.3 Managing, Protecting, and Sharing Digital Resources, 6.2 Communication and Collaboration, and 6.4 Responsible Use.
Trainer Activity Plan
Title: Sharing & Collaborating in Digital Spaces
Target Group: Adult learners (mixed digital skills)
Duration: 60 minutes (blended or online)
Learning Objectives:
· Learners practice using one collaborative tool (Google Docs/Padlet).
· Learners share a resource responsibly (with attribution and correct permissions).
· Learners reflect on teamwork challenges in digital environments and suggest solutions.
· Learners experience how AI can support group work through summarization or organization.
Materials:
· Access to Google Drive or Padlet.
· Example documents (one correctly attributed, one without attribution).
· Simple permissions guide (view/comment/edit).
· AI demo tool (ChatGPT or similar) for summarization.
Activity Flow:
Warm-up (5 min): Discussion prompt: ‘How do you usually share files with friends/colleagues? What problems have you experienced?’
Input (10 min): Trainer explains permission settings (view, comment, edit), attribution, and copyright basics. Short demo on how to share responsibly.
Group Task (20 min): In small groups, learners co-edit a shared document (e.g., ‘Group Digital Glossary’ or ‘Shared Learning Resource Folder’).
o Each group member contributes one entry/resource.
o Learners practice setting correct permissions and adding attribution.
Ethical Reflection (10 min): Guided discussion: ‘What risks exist when sharing files without thinking about privacy or copyright? How do we avoid these issues?’
AI Demo (5 min): Trainer shows how AI can summarize group notes or organize contributions. Learners discuss: ‘When is AI helpful, and when should we be cautious?’
Wrap-up (10 min): Learners share one strategy they will apply in their next online collaboration. Strategies can be written into a shared Padlet/Google Doc to create a collective ‘Collaboration Toolkit.’
Assessment:
Peer review: learners check each other’s contributions for accuracy and attribution.
Trainer evaluation: group product assessed for originality, collaboration, and responsible sharing.
Scenario-based quiz: short case questions on ethics, privacy, and collaboration challenges.
Self-reflection: quick exit poll: ‘What was the most useful new strategy you learned today?’
Reflection Prompts for Trainers
Self-Awareness: ‘How do I personally check credibility before using materials in class? Do I model this process transparently to learners?’
Pedagogy: ‘How can I scaffold fact-checking for low-literacy learners without overwhelming them?’
Relevance: ‘Which real-life issues (health, finance, social media) make digital literacy most urgent for my learners?’
AI and Ethics: ‘Am I teaching learners to critically question AI outputs and avoid overreliance?’
Professional Growth: ‘Where do I find reliable resources to keep my own information literacy up-to-date (e.g., fact-checking portals, academic sources, professional networks)?’
Collaboration: ‘How can I encourage peer-to-peer support in verifying information?’
Learner Empowerment: ‘Am I helping learners feel confident enough to challenge misinformation in their own communities?’
DigCompEdu Mapping (Trainer’s Lens)
The module directly strengthens educators’ competences in selecting, adapting and teaching with reliable digital information. Trainers not only use credible data for their own professional growth but also embed these skills in lessons, guiding learners to validate, organize and responsibly share information.
DigCompEdu – Module 1: Information and Data Literacy (Mapping)
Area 1 – Professional Engagement
1.3 Reflective Practice: Trainers reflect on their own strategies for evaluating sources.
1.4 Digital Continuous Professional Development: Trainers use digital resources for their own ongoing information literacy development.
Area 2 – Digital Resources
2.1 Selecting Digital Resources: Trainers select high-quality, reliable information sources to integrate into teaching.
2.2 Creating and Modifying Digital Resources: Trainers adapt information (e.g., summarizing, simplifying, creating infographics) to fit learner needs.
2.3 Managing, Protecting, and Sharing Digital Resources: Trainers model file organization, naming conventions and permission settings.
Area 3 – Teaching and Learning
3.1 Teaching: Trainers embed information literacy (searching, evaluating, managing) into regular lessons.
3.2 Guidance: Trainers guide learners step by step in verifying sources and organizing information.
3.3 Collaborative Learning: Trainers design group-based source comparison and fact-checking activities.
3.4 Self-Regulated Learning: Trainers help learners plan and monitor their own information validation processes.
Area 4 – Assessment
4.1 Assessment Strategies: Trainers use formative tasks (e.g., source evaluation exercises) to assess learners’ information literacy.
4.2 Analysing Evidence: Trainers review learners’ reasoning and digital practices to track progress.
4.3 Feedback and Planning: Trainers provide targeted feedback on learners’ information literacy tasks and guide next steps.
Area 5 – Empowering Learners
5.1 Accessibility and Inclusion: Trainers provide resources in multiple formats (PDF, audio, video, simplified text) to ensure inclusivity.
5.2 Differentiation and Personalisation: Trainers adapt fact-checking or search tasks to match learners’ digital skill levels.
5.3 Actively Engaging Learners: Trainers use real-life contexts (health, finance, workplace information) to make activities relevant.
Area 6 – Facilitating Learners’ Digital Competence
6.1 Information and Media Literacy: Core focus: evaluating and using digital information critically.
6.2 Communication and Collaboration: Trainers model responsible file sharing and teamwork.
6.3 Content Creation: Learners adapt or create simple fact-check visuals or infographics.
6.4 Responsible Use: Trainers raise awareness about ethical sharing, misinformation and digital footprints.
6.5 Digital Problem Solving: Learners apply strategies to overcome challenges in verifying or managing digital content.
Quiz
Now that you have finished the theoretical part, we invite you to take a quick knowledge test to check where you stand on the topic:
We have also prepared a practical activity for this topic, which can be accessed by pressing the button below.
Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or the European Education and Culture Executive Agency (EACEA). Neither the European Union nor EACEA can be held responsible for them.

