Tracking and Reporting for your ‘proprietary’ CMS data

The database structure of a CMS will probably always be ‘proprietary’, and probably always be less than optimal for reporting on interesting questions such as:

  • Which tools do my faculty use?
  • Which departments (Engineering, Biology, Languages) tend to use which tools?
  • Yes, faculty upload resources, but do all students actually read (access) those quiz guides or reading assignments?
  • What percentage of our instructors teaching in any given semester are using our course management system?

Notre Dame wanted to know these things. It would help us determine whether our CMS is even useful for teaching and learning, let us know where to invest our limited support efforts, and let us determine ROI on third-party plug-ins like Wimba Voice Tools.

Our solution was to extract all tracking data into a flat file (using a smallish Blackboard service engagement for them to write the query since we were tearing our hair out with duplicate entries from crosslists and child sections), and then create an ODS from it. Our extraction process is nearly ready to be scheduled as incremental so that our ODS can continually have current data appended to it.
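The dedup-and-append step described above can be sketched roughly as follows. This is a minimal illustration, not our actual extraction query: the column names (`user_pk`, `course_pk`, `tool`, `event_time`) and the child-to-parent section mapping are hypothetical stand-ins for whatever your CMS's tracking schema actually exposes.

```python
import csv

def dedupe_and_append(new_rows, flat_file_path, child_to_parent):
    """Append only unseen tracking events to the flat extract file.

    new_rows: iterable of dicts with hypothetical keys
              user_pk, course_pk, tool, event_time.
    child_to_parent: maps child/crosslisted section ids to the parent
              course id, so one click is not counted once per section.
    Returns the number of rows actually appended.
    """
    fieldnames = ["user_pk", "course_pk", "tool", "event_time"]

    # Load keys already extracted so incremental runs only append new data.
    seen = set()
    try:
        with open(flat_file_path, newline="") as f:
            for row in csv.DictReader(f):
                seen.add((row["user_pk"], row["course_pk"],
                          row["tool"], row["event_time"]))
    except FileNotFoundError:
        pass  # first run: nothing extracted yet

    appended = 0
    with open(flat_file_path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        if not seen:
            writer.writeheader()
        for row in new_rows:
            # Roll child-section activity up to the parent course.
            course = child_to_parent.get(row["course_pk"], row["course_pk"])
            key = (row["user_pk"], course, row["tool"], row["event_time"])
            if key not in seen:
                seen.add(key)
                writer.writerow({"user_pk": row["user_pk"],
                                 "course_pk": course,
                                 "tool": row["tool"],
                                 "event_time": row["event_time"]})
                appended += 1
    return appended
```

Run on a schedule (e.g. nightly or weekly), a function like this keeps the flat file current without re-extracting history, which is what makes the ODS load a simple append.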

We use Business Objects, and we use SunGard Banner’s ODS product. It therefore made sense to place our extracted table in a Business Objects student ‘universe’ alongside our Banner Student ODS data, so that we can answer departmental and classification kinds of questions.

It should be no surprise to anyone reading this that our first tool usage queries have determined that most course sections use no more than 2-4 separate tools, and that 46% of our courses use ONLY the GradeBook tool!
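Once the data is in a flat, non-proprietary table, those statistics fall out of a simple aggregation. Here is an illustrative sketch, assuming the extract reduces to (course, tool) pairs; the tool name "gradebook" and the field names are hypothetical, not the actual Blackboard identifiers.

```python
from collections import defaultdict

def tool_usage_summary(events):
    """Summarize tool usage from (course_id, tool_name) pairs.

    Returns counts of courses, the average number of distinct tools
    per course, and the percentage of courses using only 'gradebook'.
    """
    tools_by_course = defaultdict(set)
    for course, tool in events:
        tools_by_course[course].add(tool)

    courses = len(tools_by_course)
    gradebook_only = sum(
        1 for tools in tools_by_course.values() if tools == {"gradebook"})
    return {
        "courses": courses,
        "avg_distinct_tools": sum(map(len, tools_by_course.values())) / courses,
        "pct_gradebook_only": 100.0 * gradebook_only / courses,
    }
```

In practice we run the equivalent query through Business Objects against the ODS rather than in application code, but the logic is the same: group by course, count distinct tools, and flag single-tool courses.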

You can also imagine that extracting your CMS data to a non-proprietary format will facilitate your CMS reporting over the years and through any number of course management system changes. After all, it is YOUR usage data, not the vendor’s.

3 thoughts on “Tracking and Reporting for your ‘proprietary’ CMS data”

  1. How did the data compare to the standard student tool usage report from the powersight tracking report?

  2. The standard student tool usage report from the institution level for an entire semester would timeout before it completed!
    Our plan is to dump to the ODS on a scheduled basis. Since data appends, we figure why not do it every Friday night. (Reports would probably not be generated more than twice a semester).
