COAMFTE is in the process of providing additional resources and “Hot Tips” to assist our stakeholders with the COAMFTE accreditation process.  Now that Version 12 has been in effect for two years, COAMFTE has aggregated information collected from Commission reviews on commonly missed Key Elements and insufficient information noted in Eligibility Criteria, Self-Study, and Annual Report Review Letters.  In addition, COAMFTE has reviewed many programs that have recently been accredited without any stipulations.  By looking at both the deficits and the successful submissions, we would like to share some information that may assist programs with their Eligibility Criteria, Self-Study, and Annual Report submissions.

Eligibility Criteria

  1. Programs are advised to check that all links provided in the Eligibility Criteria are working (even after submission).
  2. Make sure all documents referenced are attached/linked in the Eligibility Criteria.
  3. Ensure that all policies/handbooks are publicly accessible on your program’s web page. 
    • Ensure that the link provided in the Diversity Program Composition section (EC-H) is publicly accessible. Please click here to view COAMFTE’s List of Items that Need to Be Publicly Accessible.
  4. Bookmark each individual component including appendices/supporting documents.
  5. Ensure that the Eligibility Criteria document is under 25 pages (not including appendices). If it exceeds 25 pages, you may remove EC-E’s instructions/table and input “Not Applicable” (only if you are renewing accreditation) and/or provide links to the required tables within the Eligibility Criteria document.
  6. Programs are advised to add page numbers that will help direct reviewers to where the requested policy/information is available.
  7. Programs that are resubmitting their Eligibility Criteria are not required to resubmit the entire Eligibility Criteria document. Programs are advised to submit only the information that was indicated as insufficient or that required additional information. Please remove previously submitted information that the Eligibility Review Committee indicated was sufficient.


Self-Study

  1. Programs can lay a strong foundation for the accreditation process by completing the Mission, Goals and Outcomes Template at the very start of the process.
    • Please note that each Student Learning Outcome needs to have targets and benchmarks that are measurable.
  2. Developing and following a systematic assessment plan linked to their SLOs and RESOURCES tends to assist programs in addressing many of the requirements of the Key Elements in Standards Version 12.
    • The assessment plan needs to describe the following:
      • How and when each STUDENT LEARNING OUTCOME is measured, gathered, reviewed and adjusted (if needed) AND
      • How program RESOURCES (physical, fiscal, clinical, instructional, student support services, technical, etc. that are identified in Standards III) are measured, gathered, reviewed and adjusted (if needed).
      • The process and timeline for gathering, reviewing, and analyzing data from relevant communities of interest (Hint: organize this in a chart).
    • Organize in a chart SLOs, measures, benchmarks, targets and changes based on the review of data.
    • Programs need to provide evidence and supporting data that they follow the assessment plan and describe how the findings have resulted in program improvements. (For example, programs need to show that all surveys have been implemented, reviewed and fed back into the program; AND show that assessment mechanisms for each SLO have been evaluated, reviewed, analyzed and fed back into the program.)
  3. It is important to note that in Version 12, programs need to identify formal processes (documented in writing) for review and analysis of data collected from all Communities of Interest and report their findings back to their COIs (programs cannot rely on informal means alone).
    • To demonstrate program improvement, programs need to document / provide evidence of the actions taken resulting from review of the data.

    Avoiding the Use of Grades as Assessment Measures for SLOs:
  4. For Key Element I-A, the use of grades is not a recommended assessment measure for student learning outcomes. In Standards V12, the definition of Assessment Measure is “a mechanism for evaluating progress and attainment of targets and benchmarks. Examples include exams, assignments or capstone projects with rubrics or practicum evaluation instruments.”
  5. For Key Element II-B, programs need to have a mechanism in place to evaluate the program’s climate of safety, respect and appreciation. The data from this mechanism needs to be documented and analyzed in order to be fed back into the program.

    Reporting on Sufficiency of Resources in Standards V12: 
  6. In Key Element I-B, programs are required to describe their "plan" for evaluating their resources
  7. In Standard III, programs need to show that the plan for evaluating resources/environmental supports was implemented by:
    1. Briefly identifying/clarifying their definition of sufficiency of resources (explain how you determine that the program has resources sufficient to achieve the program's mission, goals and SLOs)
      • Programs may want to indicate the threshold that signals the program when a change is needed
    2. Providing aggregated input from communities of interest that informed the review of resources in relation to the program's definition (e.g. survey results)
    3. Providing evidence that the review took place (e.g. faculty meeting minutes)
    4. Describing what was found upon the review of said input during the evaluation of resources
  8. In Key Element V-D, programs need to describe how they used the feedback around resources for program improvement/effectiveness (action taken)
  9. Sufficiency
    • Sufficiency is not synonymous with satisfaction.  Students may want X resources but need only Y resources to meet their SLOs.
    • Programs should establish their definition of sufficiency and survey the relevant stakeholders (e.g. students) based on that definition.
  10. For Key Element III-B, programs need to provide documented evidence that they are using technology and methods for supervision that are HIPAA compliant (such as HIPAA contracts/documentation for program-specified software, if using technology-mediated supervision).
  11. Visual guides such as charts or tables that display information such as links to SLOs, curriculum, or timelines are useful review aids for the Commission. For example:
    • For Key Elements I-B, II-B and IV-A, programs may want to use a chart or table to link teaching/learning practices to each SLO
    • Programs may want to use a chart or table for each SLO to identify whether or not the program is meeting the benchmarks or goals. If the program is not meeting a benchmark or goal, the program needs to describe what it is doing to meet or modify the goal.
    • Programs may want to use a chart to show their assessment plan (as mentioned above).
    Informal vs. Documented Evidence
    • Ensure that meeting minutes are taken to serve as evidence of discussions, changes, and improvements.
  12. Client Contact Hours and Alternative Hours
    • Client contact hours must take place with the client in the same physical space; teletherapy does not count.
  13. Links
    • Check your links before submitting!  If a link doesn’t work, the Commission cannot review the material and will mark the Key Element as deficient.

Site Visit

  1. Read carefully through the Self-Study Review Letter and be prepared to address areas noted as deficient by the Commission
  2. Have any new information (such as recent meeting minutes/surveys/data) available for the Site Visit Team
  3. Organize the resource room so the team doesn’t have to ask for resources when they arrive on-site. Ensure that the room locks and that a key can be provided to the Site Visit Team for the duration of the site visit
  4. Make sure the materials are accessible to the Site Visit Team. Google Drive folders can be difficult to access and navigate
  5. Ensure that all MFT faculty read the Self-Study and are familiar with the information.  
  6. Ensure faculty, supervisors and students are familiar with SLOs, Program Goals and program mission and how they are linked
  7. Ensure that faculty are aware of how Communities of Interest provide feedback and how it is incorporated into the program
  8. Ensure students understand how they are evaluated and how those evaluations are linked to SLOs
  9. Ensure that minutes reflect the decisions made and the rationale for evaluating data points
  10. We recommend conducting a mock site visit with each group the team will meet.  Develop questions and help all groups prepare
  11. If you have questions about the Site Visit or the accreditation process, call COAMFTE staff

Annual Report

  1. When filling out the Annual Report, check that the data in your program's Student Achievement Criteria (SAC) Data Disclosure Table matches the data in the Annual Report. The information on the program's website should be consistent with the information in the program's Annual Report.  In other words, the Student Achievement Criteria Data Disclosure (MC-B) presented in the Annual Report should reflect the Student Achievement Criteria disclosure table presented on the program's website. If your SAC Data Disclosure Table updates daily/monthly, please let the Commission know by providing contextual information in your Annual Report.
    • If the data on the program’s website differs from the data in the Annual Report, this raises a concern for the Commission about the accuracy of data. This could result in a Special Report and a special report fee.
  2. In regards to the Student Achievement Criteria Table located on the program’s website:
    • Include the most recent cohort on your SAC table even if you are still collecting data.
    • For cohorts for which your program is still collecting data, please input “In Process” instead of leaving table cells blank or inputting “NA”.
    • The purpose of publishing the SAC table on programmatic websites is to increase transparency of student performance and achievement.  Programs are encouraged to use it as a recruiting tool.  Prospective students, parents and the public can make informed decisions about the program based on the published information provided by the program.  
  3. After submitting the Annual Report, continue to check that the link to the program's landing/homepage and the link to the program's Student Achievement Criteria Data Disclosure Table are working.
  4. Programs are required to collect data on Job Placement Rates.  Please have an ongoing process in place to survey or collect feedback from students and graduates regarding their employment using their MFT skills.
  5. When submitting evidence of financial viability, programs must submit a signed letter from the institutional administrator.  The program’s budget alone is not an acceptable form of evidence of financial viability for Maintenance Criterion A.
    • The letter must:
      • be dated
      • be on the institution's letterhead
      • be signed by an institutional administrator who has financial oversight of the program's budget and is not serving as the program director (e.g. Department Chair, Dean, Provost)
      • indicate that the institution supports the MFT program and that resources are in place
  6. Providing contextual information is required in the following circumstances:
    • To explain a lack of data
    • To explain low student achievement rates (e.g. exam pass rates)
      • If low student achievement or a lack of data is beyond the program’s control (a student transfers out of the program, takes a sabbatical, a low survey response rate, etc.) and the program provides contextual information demonstrating that it did all it could, the Commission can close out a cohort based on that contextual information.
  7. Programs need to remain engaged with their alumni to gather the information required for the Annual Report. Programs may want to consider the following steps to help collect important data from graduates:
    • Ask students who will soon be graduating for their contact information, especially current email addresses (e.g. in an exit interview or exit survey)
    • Maintain alumni contact information and collect data from graduates for at least 3-5 years.
    • Set up an online survey to send to graduates, with suggested questions below: 
      • What year did you enter our program? (cohort information)
      • What year did you graduate?
      • Did you take (sit for) the National Exam this year?
      • Have you passed the National Exam? If so, in what year?
      • Are you an LMFT? If so, what year did you get licensed?
      • Are you currently working using the skills gained in the MFT program? If so, when did you start your job (year)? 
      • In what type of employment setting do you work?
      • Have you gone on to enroll in a doctoral program in MFT?