Deep dive into one of the 2019 AACSB Standards: How can faculty assess evidence-based decision making that integrates the application of statistical tools and techniques, data management, data analytics and information technology?
Simulations are a great way to have students use all of the above tools. Many simulations (such as Responsive Learning Technologies’ Littlefield and Supply Chain games) have students compete in teams to devise a strategy and ‘win’ the simulation by accumulating the most money. But the real learning comes after the preparation and the play of the simulation: it begins when students write up their results. By laying out guidelines on how to report their findings, including which statistical tools and techniques, data management, and data analytics they are to use, students are forced to wrestle with the underlying concepts of the simulation and to present results to ‘upper management’ that are well thought out and logical.
In my courses, I often require students to rewrite their results (often more than once) before I accept what has been turned in. It is through the process of rewriting that the true learning emerges: the proverbial light bulbs go on, and students begin to understand the ideas and concepts at a much deeper level.
Grading the reports with rubrics that embed learning outcomes, such as how well the student has used the statistical tools and how well they understand and present the data, is an excellent way to assess this standard. Building the rubric in Canvas allows for easy aggregation of student learning outcomes across courses and programs.