Back-to-back @ ISR

My paper with Muller Cheung, “Seeker Exemplars and Quantitative Ideation Outcomes in Crowdsourcing Contests”, has been accepted by Information Systems Research. A key insight in this work is how the common practice of showing seeker exemplars in ideation contests to guide and inspire solvers could result in (i) a smaller and less diverse search space explored and (ii) fewer and lower quality ideas submitted by solvers. In other words, the benefits that seeker exemplars bring to ideation contests might come at some expense to the seekers.

As with all my projects so far, there is a backstory to this paper’s path to publication; details for another time, maybe. A shoutout to the ISR editors and reviewers, who gave us great feedback to bring our work to a higher level. Extremely glad that this research has found a nice home and, at the same time, has given me my first back-to-back ISR publications.

Abstract: Idea seekers in crowdsourcing ideation contests often provide solution exemplars to guide solvers in developing ideas. Solvers can also use these exemplars to infer seekers’ preferences when generating ideas. In this study, we delve into solvers’ ideation process and examine how seeker exemplars affect the quantitative outcomes in solvers’ scanning, shortlisting, and selection of ideas; these ideation activities relate to the Search and Evaluate stage of the Knowledge Reuse for Innovation model. We theorize that solvers’ use of local (problem-related) and/or distant (problem-unrelated) seeker exemplars in the respective search and evaluation activities is affected by their belief and emphasis in contests as well as the influences of processing fluency and confirmation bias during idea generation. Consequently, local and distant seeker exemplars have different effects in different ideation activities. Consistent with our theorizing, the results from an ideation contest experiment show that, compared to not showing any seeker exemplars, providing these exemplars either does not affect or could even hurt the quantitative outcomes in the respective ideation activities. We find that solvers generally search for, shortlist, and/or submit fewer ideas when shown certain seeker exemplars. Moreover, solvers who submit fewer ideas tend to submit lower quality ideas on average. Thus, showing seeker exemplars, which contest platforms encourage and seekers often do, could negatively affect quantitative ideation outcomes and thereby impair idea quality. We discuss the theoretical and practical implications of this research.

The final draft is available at https://ssrn.com/abstract_id=3859023.

2020 – what a year!

Things have been rather slow moving this year. Here’s a quick recap of the few hits and misses in 2020.

  • Tenured, thanks heaps to folks in the field and department. (You know who you are!)
  • Had a paper desk rejected (again). Oh well. On a brighter note, managed to push a couple of papers to the next round.
  • Taught entire courses online (two undergraduate sections and two MBA/MSc sections) for the first time. The experience wasn’t as bad as I thought; the teaching evals weren’t too shabby for some of the sections.
  • Took up the Director role for the Global Business program. So there goes my plan to just cruise around for a while. On the other hand, it’s great to work among the fine young minds in the B-school.

Looking forward to a better 2021.

Running Online Exams

We are now doing 100% online teaching, which means the exam is also done online. Here are some things that I have done in setting/administering the exams (hopefully this will be useful to some of you). My objective is for all students to have a qualitatively similar individual exam with minimal incentives to help one another.

  1. Question Randomizing. I use Qualtrics to set the questions. My main objective is to test all students on the same set of N concepts. For each concept, I have M different but interchangeable questions. My assumption is that any student who knows a concept will be able to answer any of the M questions. The exam is randomized such that (i) the question sequence varies across students (i.e., students are tested on different concepts at different points during the exam) and (ii) the question for each concept varies across students (i.e., students are tested by different questions for each concept). If N and M are large (and the exam duration is relatively short), it becomes much harder for students to help one another during the exam (see the sketch after this list).
  2. Forward Move Only. I set the questions such that students answer one question at a time, and they must answer each question before moving on to the next. They cannot go back to a previous question.
  3. No Incentives To Be Helpful. Because of the question randomizing, I told students that they should turn off their mobile phones and log off from all social media and communication accounts (whatsapp/FB/IG/email/etc.). The short exam duration (30 minutes) and format (randomized questions + forward move only) mean they have little time and incentive to give help to others, so getting off their phones and accounts takes away any pressure for them to help. And because other students may not be able to help them due to the exam format, they should not waste time asking or waiting for help.
  4. Open-Everything. The exam is open book and students can have access to any course materials, online resources, etc. The only thing they cannot do is to have live support from their friends, parents, spouses, children, etc.
  5. Clear EoE. If your exam questions are not numbered in the system (as in my case, due to randomization), make sure you have a very clear “end of exam” page and communicate this to the students before the exam begins so that they know what to expect. I tell students that they basically have to keep answering whatever questions they see until there are no more questions to answer. My “end of exam” page shows:

You are done with the exam, but…

– Do not turn off the webcam or log off until the instructor says you can.
– Remain seated in front of your webcam, fold your arms, and keep your hands away from any electronic devices (phones, computers, etc.).
– Do not communicate any aspect of this exam to other students until the instructor sends out an “all-clear” announcement.
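For those curious about the mechanics of point 1, here is a minimal sketch of the randomization scheme in Python. This is purely illustrative, not actual Qualtrics logic; the question bank, concepts, and student ID below are all made up.

```python
import random

def build_exam(question_bank, student_id):
    """Build one student's exam from a bank mapping each of the N concepts
    to its M interchangeable questions."""
    rng = random.Random(student_id)            # deterministic per student
    concepts = list(question_bank)
    rng.shuffle(concepts)                      # (i) vary the concept sequence
    return [(c, rng.choice(question_bank[c]))  # (ii) vary the question per concept
            for c in concepts]

# Hypothetical bank with N = 2 concepts and M = 2 questions each:
bank = {
    "network effects": ["Define network effects.", "Give a real-world example of network effects."],
    "switching costs": ["Define switching costs.", "How do switching costs lock in users?"],
}
for concept, question in build_exam(bank, student_id=20481234):
    print(f"{concept}: {question}")
```

The payoff is combinatorial: two students have only roughly a 1/M chance of facing the same question for any given concept, and the shuffled ordering means their exams almost never line up question-for-question, so real-time collusion within a short exam window is impractical.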

If you have any good tips about running online exams, do share.

Shocking! What I teach actually works!

Received feedback from a student who took my Introduction to IS course in Spring 2018:

Given that this course is offered to pre-major freshmen/sophomores, I seldom get to hear from them after the course. But it always feels great when students come back to tell you that they could apply what they learnt in their subsequent work. For me, this course outcome is more meaningful than students’ grades or satisfaction. Yet, as I told the student above, it takes more than just my teaching or the course content. You need students with the right attitude who want to be inspired. We don’t often get such students, but when we have them in our classroom, it makes a world of difference to our teaching and the learning environment.

Nonetheless, I still take whatever credit is given to me for the student’s outstanding performance at the bank 🙂

Paper accepted @ ISR

My paper “Adopting Seekers’ Solution Exemplars in Ideation Contests: Antecedents and Consequences” has been accepted at Information Systems Research.

Abstract: To benefit from the wisdom of the crowd in ideation contests, seekers should understand how their involvement affects solvers’ ideation and the ensuing ideas. The present study addresses this need by examining the antecedents and consequences of solvers’ exemplar adoption (i.e., use of solution exemplars that the seekers provide) in such contests. We theorize how the characteristics of seekers’ exemplars (specifically, quantity and variability) and prizes jointly influence exemplar adoption. We also consider how exemplar adoption affects the effectiveness of the resulting ideas, conditional on solvers’ experience with the problem domain of the contests. The results from a company naming contest and an ad design contest show that exemplar quantity and exemplar variability both positively affect exemplar adoption, but the effects are strengthened and attenuated, respectively, by prize attractiveness. The outcomes of a campaign using the ads from the design contest further show that greater exemplar adoption improves ad effectiveness (in terms of click-through performance), although this is negatively moderated by solvers’ domain experience. We discuss the theoretical and practical contributions of this research to ideation contests.

To view the final draft: http://ssrn.com/abstract=3034630. Credits to the ISR review team for helping me improve this research. The paper went through a few rounds of review and expanded from one study in the initial submission to three studies in the final version. I also received valuable comments from many colleagues in the community. The end product is definitely a much better piece of work.

Lovely start to 2018

Received a new year greeting from a former student today:

Happy new year to you! Long time have not contacted you but I really want to thank you again at this time for the IS intro courses you brought to me! I am now doing a tech-startup intern which is platform-oriented, and I found the concept I learnt from the course really helps me understand the work and think about the business logically. I am now doing the major of dual degree in CS and business and want to explore the tech industry more. Hope I have the chance in the future to hear great insights from you like I did in the 2010 courses! Thank you so much! Wish you a happy and enriching 2018!

It is always nice to hear from students that what they learnt in my courses actually helps them in their work and spurs them to explore more. Coincidentally, I was in the middle of preparing and revising the syllabus for the said Intro. to IS course for the coming semester when I received the email. For the last few years, my colleagues and I have been teaching this course in a “reformed” way: instead of simply teaching off an introductory textbook, we try to make the course more relevant and applicable by covering certain key IS topics and ideas in today’s economy (e.g., platform economics, social media strategy, big data analytics, etc.) as well as traditional core IS concepts. However, some students feel that the way this course has been taught does not help them in the other IS courses that they take subsequently. A senior colleague heard such feedback and suggested that we perhaps teach the Intro. to IS course in a more “traditional” manner, just like how introductory courses for marketing and accounting are structured, where the focus is on the fundamentals, basic principles, etc. of the respective disciplines. Over the past few weeks, I have been trying to decide whether/how to revert the course to the way it was taught in the old days and which topics to drop. Well, the student’s greeting just made the decisions clearer.


Back in the game…

I have always had a soft spot for tech startups. I co-founded one when I was a college freshman many years ago, way before launching a startup was cool and fashionable. Running that startup was fun and exciting, but it also brought along a great deal of uncertainty, a tad too much for my wife’s comfort. After a number of years with that startup, I made the difficult decision to exit it so as to pursue a Ph.D. But who knew that grad school was also fun and exciting but also brought along a great deal of uncertainty, a tad too much for my wife’s comfort… 🙂 (Doing a Ph.D./research actually shares many similarities with running a startup; I will save that topic for another post.)

So when I received my Ph.D., I promised my wife that I would behave myself and focus on my academic career for 6 years, which is roughly the amount of time a rookie assistant professor needs to beef up his profile for a tenure application. No startups (or other funny ideas) before that. Nevertheless, given my work and research interests, I have had many opportunities to advise startups. Although a few of these startups have interesting business propositions, I have always resisted the temptation to get too involved in them.

Well, the situation has just changed. Some months ago, a friend roped me into a tech startup in a non-executive role. I can’t reveal much about the company at this point, but I can say that it is working with a pretty exciting technology: one that I believe will be the backbone of many future tech trends and products, and one that passes the “strategic value” criterion I talk about in class. Sometimes you just have to walk the talk and put your money where your mouth is…

It is not all rosy

A few weeks ago, I noticed that the feedback I have shared on this website about my undergraduate course Introduction to Information Systems (ISOM2010) is mainly positive. I feel I should also show other types of feedback that I have received so as to give a more accurate picture. However, students only email or tell me about their course experience when they have nice things to say (understandably so). Although students had given some not-too-positive feedback in previous course evaluations, those comments were usually not very “juicy” in the sense that they were mainly about the heavy workload, etc., which I actually show to current students in class to help them manage their expectations. So I thought I should wait for the latest evaluations to come in and see if I could put together more of the “dark side” of my teaching before making a post. Right from the first lecture this semester, I could sense that the vibe among certain students in one of the sections (L2) wasn’t too positive.

Well, I received the evaluation reports this morning and I must say that this year’s students, especially those from L2, did not disappoint. A few students wrote passionately about their horrible experience with my teaching. In fact, I was so bad that I achieved a personal career-low instructor rating. A pretty humbling experience, I must say. Below, I provide this semester’s evaluation reports in their entirety. This information will help future students who are enrolled in my ISOM2010 course (or thinking of enrolling) know what to expect and perhaps run away (i.e., ditch the course) while they can. For good measure, I’m including the evaluation report for the MBA course on digital marketing (ISOM5390) that I taught this semester too.

[Quick tip for navigating the reports: Q11 is the instructor rating score; Q14 is about the weak points of the course/instructor.]

ISOM2010 L1:

Download (PDF, 130KB)

ISOM2010 L2:

Download (PDF, 134KB)

ISOM2010 L3:

Download (PDF, 129KB)

ISOM5390:

Download (PDF, 150KB)