Summary of "Definition of the quality team & evaluation sheets"

Purpose of the session

High-level takeaways

Quality Team structure

Who can volunteer (eligibility rules)

Roles & permissions (practical)

Evaluation system — sheets, criteria and scoring (methodology)

  1. Data collected per member
    • Personal: name, phone, email, university, academic level, department/committee.
    • Monthly/weekly task plan links (Google Drive links to internal/external plans).
  2. Core scoring elements (per task / per month)
    • Task quality (out of 10).
    • Task timeliness (deadline score out of 10).
    • Attendance at meetings (0 / 1 / 2 points): 2 = attended, 1 = excused, 0 = unexcused absence.
    • Event attendance and contribution (separate measure).
    • HR soft-skill/teamwork evaluation (communication, teamwork, PR messaging).
    • Warnings, bonuses and deductions: recorded with documented reasons.
  3. Evaluation responsibilities (who scores what)
    • Department Head / Face Head: evaluates task quality and head-level assessment.
    • HR: evaluates deadline confirmation and soft skills; issues official warnings/penalties/bonuses.
    • Review Committee (for events): evaluates event content using ERS/IR criteria from PR / Marketing / Academic perspectives.
    • Reporting Committee: compiles and aggregates the data, corrects technical errors, and feeds dashboards.
  4. Scoring aggregation and display
    • Task/member scores are summed monthly and averaged.
    • Dashboard uses colors to flag averages: red = below average (requires attention), green = top performer.
    • Ten-point scales convert to points; the overall pass/certificate decision is determined by the percentage of total possible points earned across the season.
    • Heads/presidents can apply justified bonuses or minuses, but must document reasons.
  5. Warnings and HR process
    • Warnings are issued by HR per regulations (e.g., a standard policy such as 3 warnings → removal).
    • Any bonus or deduction must include a written reason in the HR box.
  6. Meetings/events & internal vs external plan
    • Internal plan = head-created personal development tasks for members (soft/technical skills).
    • External plan = tasks for model events/outputs visible to the public.
    • Heads submit monthly plans (by week) and learning outcomes; place Google Drive links in the sheet (not full files in cells).
  7. Appeals / complaints
    • A Care System (complaint channel) exists for appeals and complaints; presidents may be asked to provide reports for review.
    • Members can request to see their reports if denied a certificate (right to request evidence).
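The scoring and aggregation rules above (items 2 and 4) can be sketched as a small script. All function and field names are illustrative, and the 70% certificate threshold is an assumption; the session only states that the pass/certificate decision is percentage-based, without fixing a number.

```python
# Sketch of the monthly scoring aggregation described above.
# Names and the 0.7 certificate threshold are illustrative assumptions.

def task_score(quality: int, deadline: int, attendance: int) -> int:
    """Sum one task's components: quality /10, deadline /10, attendance 0/1/2."""
    assert 0 <= quality <= 10 and 0 <= deadline <= 10 and attendance in (0, 1, 2)
    return quality + deadline + attendance

def monthly_average(task_scores: list[int]) -> float:
    """Task scores are summed monthly and averaged (per the session)."""
    return sum(task_scores) / len(task_scores) if task_scores else 0.0

def dashboard_color(member_avg: float, team_avg: float) -> str:
    """Simplified two-color flag: red = below average, green otherwise."""
    return "red" if member_avg < team_avg else "green"

def earns_certificate(points: int, max_points: int, threshold: float = 0.7) -> bool:
    """Pass/certificate by percentage of total possible points (threshold assumed)."""
    return points / max_points >= threshold

# Example: two tasks in one month.
scores = [task_score(8, 9, 2), task_score(6, 10, 1)]  # 19 and 17
avg = monthly_average(scores)                          # 18.0
```

Bonuses and deductions (item 2, last bullet) would be added to the point totals separately, each with its documented reason.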

Event evaluation & ranking system (ERS / IRs)

Monthly and aggregate reports

Practical instructions — step-by-step actions

For Heads when assigning a task:

  1. Fill in the monthly/weekly plan and include learning outcomes for internal development tasks.
  2. Provide the Google Drive link for the task plan in the sheet.
  3. Define the task’s deadline and scoring rubric (deadline score / quality score).

For Members when completing a task:

For HR and Heads doing evaluations:

For Review Committee after events:

For Reporting Committee staff:

Training, rollout and recruitment

Rules, culture and practical notes emphasized

Session format & extras

People / speakers / named sources

Practical checklist — next actions

If you are a member:

If you are a head/president:

If you want to volunteer in Quality:

End note

Quality control is designed as a consistent, documented and transparent monitoring system — to protect members’ rights and encourage improvement through documented feedback, not to punish arbitrarily. The system links individual monthly/weekly evaluations to team and model-level rankings and adds quality into the model league calculations (not just quantity).
