Summary of "Definition of the quality team & evaluation sheets"
Purpose of the session
- Explain the role, structure and practical work of the new Quality Team (2026).
- Show how members, heads and presidents will be evaluated using evaluation sheets and the Event Evaluation & Ranking System (ERS / IRs).
- Announce volunteering openings, eligibility and training for the Quality Team.
High-level takeaways
- The Quality Team is focused on monitoring and guidance, not execution. Central committees (HR, PR, Marketing, Logistics, Academic) remain responsible for action; Quality monitors, reports, reviews and recommends.
- The Quality Team consists of three parts: Reporting Committee, Review Committee, and a Legislation (Regulations) Subcommittee — each with distinct responsibilities and eligibility rules.
- Evaluations use structured sheets (monthly/weekly) and dashboards. Scores combine task quality, timeliness, attendance, event contribution and HR soft-skill checks. Scores feed an automated dashboard and a league ranking that now includes quality as well as quantity.
- Confidentiality and separation of duties are emphasized: reporting volunteers cannot evaluate their own model, and access to sheets is limited.
Quality Team structure
- Quality Control Leader (and Assistant).
- Reporting Committee
- One or more heads and reporting members.
- Prepares and collects member/head evaluation reports and checks technical correctness of evaluation sheets.
- Responsible for data entry, maintenance and ensuring the system (sheets/formulas) works.
- Review Committee (umbrella)
- Evaluates public event outputs via ERS/IRs.
- Contains departments/sections: Academic, PR (Public Relations), Marketing.
- Heads and members in each department evaluate events from their specialty.
- Legislation (Regulations) Subcommittee
- Smaller group that periodically reviews and updates regulations.
- Coordinates with Reporting Committee to ensure regulations are embedded into reporting.
Who can volunteer (eligibility rules)
- General rule: minimum one full season / one year of experience in the model/program category (limited exceptions may apply).
- Reporting Committee
- Must have at least one year of Alpha Model experience.
- Cannot be a current president/head in the model being evaluated (confidentiality).
- Requires a laptop.
- Review Committee
- At least one season of experience is preferred.
- Volunteers may be active in their model (since they evaluate public work), but must leave model groups when necessary to preserve transparency.
- Legislation Subcommittee
- Restricted to mentors and designated senior roles (mentors named during the session).
- Small allowance: up to 20% of recruits may be recent graduates (with proven prior activity), not current students.
Roles & permissions (practical)
- Reporting Committee
- Collects and processes evaluation sheets, fixes technical issues, ensures data confidentiality, prepares aggregated dashboards and reports for presidents/HR.
- Review Committee
- Evaluates events via ERS/IRs using department-specific criteria (PR, Marketing, Academic).
- HR department (central model role)
- Issues warnings, applies penalties/bonuses based on rules; HR and head jointly record and justify HR actions.
- Central committees/models
- Apply actions (warnings, sanctions); Quality reports and recommends only.
- Access to sheets
- Restricted: only specific roles (president, vice president, head of HR, HR) see particular aggregated dashboards.
- Raw formulas and admin tabs are blocked from general editing.
Evaluation system — sheets, criteria and scoring (methodology)
- Data collected per member
- Personal: name, phone, email, university, academic level, department/committee.
- Monthly/weekly task plan links (Google Drive links to internal/external plans).
- Core scoring elements (per task / per month)
- Task quality (out of 10).
- Task timeliness (deadline score out of 10).
- Attendance at meetings (0 / 1 / 2 points): 2 = attended, 1 = excused, 0 = unexcused absence.
- Event attendance and contribution (separate measure).
- HR soft-skill/teamwork evaluation (communication, teamwork, PR messaging).
- Warnings, bonuses and deductions: recorded with documented reasons.
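The per-member elements above can be pictured as one monthly record. This is an illustrative sketch only — the field names are assumptions based on the list above, not the sheet's real column headers:

```python
from dataclasses import dataclass, field

@dataclass
class MemberMonthEntry:
    task_quality: int        # out of 10, scored by the head
    deadline_score: int      # out of 10, timeliness confirmed by HR
    meeting_attendance: int  # 2 = attended, 1 = excused, 0 = unexcused
    event_contribution: int  # separate event attendance/contribution measure
    soft_skills: int         # HR communication/teamwork check
    notes: list = field(default_factory=list)  # documented warnings/bonuses with reasons
```

In sheet terms, each field corresponds to a column the responsible role fills in; the `notes` list mirrors the rule that every warning, bonus or deduction carries a written reason.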
- Evaluation responsibilities (who scores what)
- Department Head / Face Head: evaluates task quality and head-level assessment.
- HR: evaluates deadline confirmation and soft skills; issues official warnings/penalties/bonuses.
- Review Committee (for events): evaluates event content using ERS/IR criteria from PR / Marketing / Academic perspectives.
- Reporting Committee: compiles and aggregates the data, corrects technical errors, and feeds dashboards.
- Scoring aggregation and display
- Task/member scores are summed monthly and averaged.
- Dashboard uses colors to flag averages: red = below average (requires attention), green = top performer.
- Ten-point scales convert to points; the overall pass/certificate is determined by the percentage of total possible points earned across the season.
- Heads/presidents can apply justified bonuses or minuses, but must document reasons.
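The aggregation logic above can be sketched as follows. This is a minimal illustration, assuming a 22-point monthly maximum (10 quality + 10 deadline + 2 attendance) and hypothetical colour cut-offs — the session's actual formulas and thresholds are not specified here:

```python
def monthly_total(task_quality, deadline_score, attendance):
    """Sum one month's components: quality /10, deadline /10, attendance 0-2."""
    return task_quality + deadline_score + attendance

def season_summary(monthly_totals, max_per_month=22, green_cutoff=0.85, red_cutoff=0.5):
    """Average the monthly totals and flag the result with a dashboard colour."""
    avg = sum(monthly_totals) / len(monthly_totals)
    pct = avg / max_per_month      # share of total possible points (pass/certificate basis)
    if pct >= green_cutoff:
        flag = "green"             # top performer
    elif pct < red_cutoff:
        flag = "red"               # below average, requires attention
    else:
        flag = "neutral"
    return avg, round(pct * 100, 1), flag
```

For example, monthly totals of 20, 18 and 22 average to 20.0 points (about 91% of the possible total), which this sketch would flag green.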
- Warnings and HR process
- Warnings are issued by HR per regulations (e.g., a standard policy such as 3 warnings → removal).
- Any bonus or deduction must include a written reason in the HR box.
- Meetings/events & internal vs external plan
- Internal plan = head-created personal development tasks for members (soft/technical skills).
- External plan = tasks for model events/outputs visible to the public.
- Heads submit monthly plans (by week) and learning outcomes; place Google Drive links in the sheet (not full files in cells).
- Appeals / complaints
- Care System (complaint channel) exists for appeals or complaints; presidents may be asked to provide reports for review.
- Members can request to see their reports if denied a certificate (right to request evidence).
Event evaluation & ranking system (ERS / IRs)
- ERS / IRs = Event Evaluation and Ranking System.
- Workflow
- President / vice-president / head completes the event entry (event code, event type).
- After event completion, Review Committee departments (Academic / PR / Marketing) evaluate the event by department-specific criteria (e.g., PR tone, academic content quality, marketing continuity).
- Evaluations feed into the QC Review tab and produce an event ranking.
- League ranking
- 2026 update: league ranking combines quality and quantity.
- Models earn points for event quantity and for event quality (via ERS) — equal event counts are differentiated by quality.
- Event types and requirements are defined and organisers must follow them.
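The 2026 league idea — equal event counts differentiated by quality — can be sketched as a two-level sort. The data shape and tie-break rule here are assumptions for illustration, not the league's published formula:

```python
def league_ranking(models):
    """models: {model_name: list of ERS quality scores, one per event}.
    Rank by event count first; break ties with average ERS quality."""
    def key(item):
        name, scores = item
        count = len(scores)
        avg_quality = sum(scores) / count if count else 0.0
        return (count, avg_quality)
    # reverse=True: more events first, then higher average quality
    return [name for name, _ in sorted(models.items(), key=key, reverse=True)]
```

So two models with two events each are separated by their ERS averages, which is the "quality as well as quantity" behaviour described above.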
Monthly and aggregate reports
- Heads use a monthly member sheet for month-by-month scoring.
- Presidents receive an aggregated “intelligence” sheet summarizing all five committees, per-committee rankings, and full-model ranking.
- Access to aggregated sheets is limited (president, vice president, head of HR, head HR deputy).
- Formulas and internal calculation tabs are locked; members see summarized outputs only.
Practical instructions — step-by-step actions
For Heads when assigning a task:
- Fill in the monthly/weekly plan and include learning outcomes for internal development tasks.
- Provide the Google Drive link for the task plan in the sheet.
- Define the task’s deadline and scoring rubric (deadline score / quality score).
For Members when completing a task:
- Submit deliverables as required (avoid last-minute or AI-only submissions).
- Ensure profile fields are filled (name, contact, university, level, department) — missing info may forfeit rights/certificates.
- If absent, upload an excuse in the notes: excused = 1 attendance point, attended = 2, unexcused = 0.
For HR and Heads doing evaluations:
- Evaluate deadlines independently (both head and HR).
- Head evaluates quality (content, technical correctness); HR evaluates soft skills and teamwork.
- When issuing warnings, document the reason and cite the regulation used.
For Review Committee after events:
- Use the ERS/IRs template and department-specific criteria to score PR, Marketing and Academic aspects.
- Submit scores to the QC Review tab; mark the event Done so it appears in the review list.
For Reporting Committee staff:
- Fix sheet technical problems, maintain formulas, aggregate data and produce dashboards for presidents.
- Keep all reports confidential and enforce correct access controls.
Training, rollout and recruitment
- The Quality Team will train volunteers on using the sheets and formulas and on evaluating correctly.
- Reporting volunteers require a laptop and one season experience; Review volunteers need experience (but may remain active in their model when appropriate).
- Application form and Quality Proposal link will be shared; read requirements carefully before applying.
- Timeline: evaluations and full sheet availability expected to roll out more broadly around March–April.
Rules, culture and practical notes emphasized
- Separation of duty and transparency are crucial: reporting volunteers cannot be current presidents of the evaluated model.
- Rules apply to everyone equally — mentors, presidents, members.
- Document all bonuses and penalties; no arbitrary changes based on mood.
- Quality exists to help and protect members’ rights, not to police unfairly.
- Appreciation and volunteer spirit are central — the system was built pro bono by a team that invested many hours.
Session format & extras
- Presentation included an interactive game/quiz to check attention.
- Demonstration of the sheet/dashboard, scoring colors, ranking logic, and bonus/minus operations.
- Announcement: volunteer applications opening for the Quality Team (Reporting slots emphasized).
- Training promise: new volunteers will be trained on sheets and formulas.
People / speakers / named sources
- Basmala Mohamed — Logistics member (session logistics).
- Dr. Mario (Mario Shenouda) — Presenter; Quality Control Leader (2026) and main speaker; designer of the evaluation system.
- Mentors / deputies:
- Essam Alaa — mentor for models
- Mariam Saad — deputy mentor (models)
- Ayman Wahid — deputy mentor (models)
- Mohamed Khafagy — mentor for programs
- Mr. Abdullah — deputy mentor (programs)
- Team contributors:
- Shahd Atef, Rana Rami, Jana Hossam, Nahla Essam, Mariam Tarek (recent join)
- Washstack team — thanked for logistics/organization support.
- Other participants (question contributors): Kawthar, Rafida, Mahmoud Mohamed, Ayman, Ayman Gamal, Menna, Noura, Wissam, Yasmine, Seham, and others.
Practical checklist — next actions
If you are a member:
- Complete your personal data in the head’s sheet.
- Learn the evaluation criteria for your committee.
- Ask your head for progress reports.
- Use the Care System to appeal if needed.
If you are a head/president:
- Prepare monthly internal & external plans, set learning outcomes.
- Submit Google Drive links in the sheet.
- Document reasons for any HR sanctions or bonuses.
If you want to volunteer in Quality:
- Read the Quality Proposal and application link (to be distributed).
- Ensure you meet experience requirements (one full season) and, for Reporting, have a laptop.
- Be ready to attend training sessions on the sheets and evaluation process.
End note
Quality control is designed as a consistent, documented and transparent monitoring system — to protect members’ rights and encourage improvement through documented feedback, not to punish arbitrarily. The system links individual monthly/weekly evaluations to team and model-level rankings and adds quality into the model league calculations (not just quantity).