Tuesday, December 4, 2012

Changing Company Culture – Module 13


One of the things that drew me to GMI over the myriad of other companies I've received offers from is the company culture. It is open, accepting, and encourages communication, growth, and learning. Honestly, I am proud to say that I am a GMI employee, and I absolutely love (and am passionate about) what I do.

While GMI doesn't classify web analyst as an official role (those duties are cast under the wider umbrella of software developer/analyst), I have no doubt that everyone involved – including our CIO Mike Martiny and CMO Mark Addicks – is more than appreciative of what those analysts tell them. They are both incredibly receptive, forward-thinking guys – and to them, every opinion matters.

Part of what makes our company culture function so well is that we are accepting of the data – and we measure almost everything. From the trending changes in the average age of the Lucky Charms buyer to the happiness of our employees at work, everything is measured, constantly.

Throughout this course, I got a chance to see just how complex some of the things we measure really are, why we do them, and how that helps us become a better, more responsive organization.

Honestly, our company is so data-driven that I question whether there are even three things I could recommend. However, nothing is beyond improvement, so rather than recommend new measurements, here are three measurements/processes I might enhance:

    1.     Allow BUs to count recurring CI hours each year rather than only once per FY.
a.     For example, if I develop an automation process that saves the company 12,000 man-hours every year, I am only allowed to count that savings toward my BU's CI once – and never again.
    
    2.     Inventory and measure our server disk space and growth in such a way that we can automate the growth process.

    3.     Provide employees with evaluations both at mid-year and at year's end in order to give them a sense of their objectives and current status.

In the end, I'm glad that GMI fosters data, openness, and collaboration – it's part of what makes this company (and my job) great.
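The second recommendation above – inventorying server disk space so growth can be automated – could start as small as a scheduled script. Here is a minimal Python sketch; the mount points are illustrative assumptions, not our actual servers:

```python
import shutil
from datetime import date

# Hypothetical mount points to inventory -- names are illustrative only.
VOLUMES = ["/", "/var"]

def snapshot(volumes):
    """Record used/total space per volume so growth can be trended over time."""
    rows = []
    for vol in volumes:
        usage = shutil.disk_usage(vol)
        rows.append({
            "date": date.today().isoformat(),
            "volume": vol,
            "used_gb": round(usage.used / 1e9, 2),
            "total_gb": round(usage.total / 1e9, 2),
            "pct_used": round(100 * usage.used / usage.total, 1),
        })
    return rows

for row in snapshot(["/"]):
    print(row)
```

Run daily and appended to a log or database, snapshots like these give you the growth rate per volume – which is exactly what an automated expansion process would key off.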

Hiring Analysts – Module 13


For this post, I chose a gentleman named Earle Richards, whom I know of through a friend at GMI. Earle is a Digital Media Analyst for A&E, and he does what we've been discussing in this class for a living.

See Earle’s LinkedIn here:  http://www.linkedin.com/in/earlerichardsjr

As you can see, Earle has extensive experience in the field of Digital Marketing. He has served in both consultant and senior corporate roles, and has experience with Omniture, Cognos, and more.

Personally, I think Earle's qualifications speak for themselves in many ways. To ensure that he would be a good fit for the position, I would want to gauge his analytical thought process by asking him questions that assess his ability to size up a situation – something like: "In situation X, what would you do, and why?"

Additionally, I would want to hear any implementation recommendations he might have, and the reasoning behind them.

In the end, I feel that Earle would be an excellent candidate for the position based on his extensive experience, skills, and knowledge.

Web Analytics Jobs – Module 13


For this post, I chose a job posted on Monster.com with the company Vanguard – an investment management company.


I chose this job because it seems as though it would allow the candidate to interact with the business while maintaining a wealth of technical knowledge and prowess – much like my current job. I also liked the fact that the company doesn't list a myriad of platform-specific skills – which could mean either 1.) they haven't the slightest idea what to do in terms of analytics, or 2.) they're open to suggestion and interpretation as to the best way to proceed.

Some of the qualifications (skills) mentioned in the job posting include:

    1.     Proficiency with Microsoft Office; experience with Business Objects, Cognos, SQL and SAS preferable.

    2.     Strong analytical skills coupled with the ability to form hypotheses and draw meaningful insights from disparate data sources.

    3.     Past experience with WebTrends, Adobe SiteCatalyst, IBM Coremetrics, Comscore Digital Analytix, Google Analytics or other similar tools required. Proficiency with SQL, HTML and JavaScript will be essential.

In order to make myself more attractive to this position, I would:

    1.     Cite the experience I've gained working with the GMI employees who have helped me apply material from this course to real-world projects – including introducing me to Adobe SC.
    
    2.     Cite my experience performing analytics for small businesses for the past several years.

    3.     Show off my analytical prowess by showing Vanguard some of the automation processes I've created for GMI (process analysis requires you to know something end-to-end – an immensely valuable skill in any sort of implementation role).

All in all, I think this would be a great position for me – as I have much of the experience, many of the technical qualifications, and a definite level of interest. To be completely honest, though, I don't think I'll be leaving GMI anytime soon (I love my job!).

Tuesday, November 27, 2012

Measurement Plan for Higher Learning – Module 12


For this post, I’ll make the obvious choice, and develop a measurement plan for LTU.

Using the same steps as in previous posts, we can easily develop a measurement plan for the university that is cost effective and provides actionable insight.
   
    1.     Use Your Mission to Define Your Objectives
a.     According to LTU's website, its mission is "To develop leaders through a student-centric environment with innovative and agile programs embracing theory and practice."
        i.     As one can imagine, much of what KDP discusses in both chapters pertains to this mission. In order to fulfill its objectives, LTU must ensure that the following steps are thoroughly fleshed out.

    2.     Identify and Prioritize Your Audiences
a.     LTU needs to identify its vital stakeholders, which will include several groups, such as:
        i.     Students
        ii.    Staff
        iii.   Deans
        iv.    Faculty
        v.     Parents
        vi.    Donors
        vii.   and more…

It will be vital that LTU prioritize and meet the various needs of these publics. For example, students will require things like a quality education and plenty of extracurricular activities. Parents – on the other hand – will want a safe, cost-effective college environment. College deans will want to see the cohesion of all of these elements in the form of increased enrollment, national recognition and funding, and increased staff retention rates.

    3.     Define objectives
a.     LTU could define several objectives related to its overall mission. Some of those objectives might be:
        i.     Increased student enrollment
        ii.    Increased national recognition
        iii.   Increased student and staff retention
        iv.    Increased funding

    4.     Establish a Benchmark
a.     LTU could establish benchmarks by comparing itself to other private universities in the U.S., as well as by using historical data for reference.

    5.     Pick a Measurement Tool
a.     LTU could utilize nearly all of the measurement tools we’ve discussed in the course, ranging from newspaper and clip analysis, to website traffic analysis, to surveys and mass data compilation to measure their success. Much like the enterprise environment, LTU can also gauge its success against its peers.

    6.     Analyze Results and Make Changes
As LTU evaluates the information it gathers, it can act on that information and convert insight into improvement. Through the information attained in the program, LTU can reformulate itself both horizontally and vertically to ensure that it is meeting the needs of each and every one of its valuable publics.

References:

http://www.ltu.edu/presidents_office/mission.asp?_wds=cs

Distance Learning Case Study Discussion – Module 12


As most of us who have taken an online LTU course know, the university commonly deploys pre-, mid-, and post-experience electronic surveys to assess the quality of its online instruction efforts. While I can see the value in this sort of standardized survey, I don't believe they truly measure LTU's program effectiveness.

Why? Because they don't measure course effectiveness – they simply measure course experience.

For example, the evaluations ask simple questions like "Did we (the students) feel the learning experience was effective?" or "Do we believe that the materials were useful?" These questions are almost always answered on a scale of 1-5. Questions like these merely provide a plug-and-play input for students, and a mush of data for instructors. The question is, just how actionable is that mush? Does it serve LTU's primary audiences for these surveys (students, instructors, and staff) well?

The answer is no. The fact is, these surveys don't allow the institution or the instructor to see whether the student actually took anything of value away from the experience – they don't test the "theory and practice" motto that the school supposedly staunchly supports.
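To see why a pile of 1-5 ratings turns into a "mush," consider a quick sketch with invented numbers: two hypothetical course sections can post the identical average score while telling completely different stories.

```python
from statistics import mean
from collections import Counter

# Invented ratings for two hypothetical course sections, both on a 1-5 scale.
section_a = [3, 3, 3, 3, 3, 3]  # everyone lukewarm
section_b = [1, 1, 1, 5, 5, 5]  # sharply divided

for name, ratings in [("A", section_a), ("B", section_b)]:
    print(name, mean(ratings), Counter(ratings))
```

Both sections average exactly 3, yet section B clearly has a serious problem for half its students – precisely the nuance a single averaged score buries.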

Now – some of you might argue, "That's what finals are for…"

The problem with that argument, though, is that finals are biased – they can be created by the instructor to be exceptionally easy, insanely difficult, or just plain off-topic. At the same time, they can be manipulated to serve an instructor's interests, or curved to improve percentages.

These palpable erosions of accuracy and richness of data serve to corrode relations with students, instructors, and faculty. If the students don't feel the surveys matter, they won't fill them out (many do not). If the instructors cannot retrieve actionable data, why deploy the surveys in the first place? If high-level faculty cannot act on the data, course experiences cannot be improved, and as a result, the entire university suffers.

To rectify some of these issues, I feel that the school should develop customized surveys, using input from instructors, that evaluate the students' experience as well as their acquired knowledge through a set of questions pertaining to the material. These surveys would be evaluated and scored by the same people who manage surveys now, would serve as a much more realistic indicator of course quality and effectiveness, and could be used both online and offline.

However, I would not allow these surveys to affect a student's grade; rather, I would utilize this data to demonstrate where specific courses are effective, need improvement or modification, or perhaps need restructuring. A survey of this type would provide LTU with truly actionable information that both high-level staff and instructors could use to improve course effectiveness and student learning.