
June 13, 2014

Changing the riskiness of bets to make hot hands happen

Filed in Articles, Ideas, Research News

SELF-CONSTRUCTED STREAKS IN GAMBLING SUCCESS AND FAILURE


We came across this writeup (excuse the click-baity title) of a new paper by Juemin Xu and Nigel Harvey called “Carry on winning: The gamblers’ fallacy creates hot hand effects in online gambling” and found it fascinating. It’s one of those things you read and think, “How could no one have thought of this before?”

ABSTRACT

People suffering from the hot-hand fallacy unreasonably expect winning streaks to continue whereas those suffering from the gamblers’ fallacy unreasonably expect losing streaks to reverse. We took 565,915 sports bets made by 776 online gamblers in 2010 and analyzed all winning and losing streaks up to a maximum length of six. People who won were more likely to win again (apparently because they chose safer odds than before) whereas those who lost were more likely to lose again (apparently because they chose riskier odds than before). However, selection of safer odds after winning and riskier ones after losing indicates that online sports gamblers expected their luck to reverse: they suffered from the gamblers’ fallacy. By believing in the gamblers’ fallacy, they created their own hot hands.
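The mechanism is easy to see in a toy simulation. Here is a minimal sketch, entirely ours rather than the authors’ (the starting win probability, the 0.1 adjustment step, and the 0.1–0.9 bounds are arbitrary assumptions): a bettor moves to safer odds after each win and to longer odds after each loss, and every bet wins with exactly the probability its odds imply, so no luck carries over from bet to bet.

```python
import random

def simulate(n_bets=100_000, p_start=0.5, step=0.1, seed=1):
    """A bettor who hedges into safer odds after a win and chases
    riskier odds after a loss (the gamblers' fallacy). Each bet wins
    with exactly the probability its odds imply."""
    random.seed(seed)
    p = p_start
    wins = []
    for _ in range(n_bets):
        won = random.random() < p
        wins.append(won)
        # Expecting luck to reverse: safer odds (higher win probability)
        # after a win, riskier odds (lower win probability) after a loss.
        p = min(0.9, p + step) if won else max(0.1, p - step)
    return wins

wins = simulate()
after_win = [b for a, b in zip(wins, wins[1:]) if a]
after_loss = [b for a, b in zip(wins, wins[1:]) if not a]
print(f"P(win | won last bet):  {sum(after_win) / len(after_win):.3f}")
print(f"P(win | lost last bet): {sum(after_loss) / len(after_loss):.3f}")
```

The win rate conditional on a previous win comes out far above the rate conditional on a previous loss: a self-constructed hot hand with no luck involved at all.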

REFERENCE
Xu, J., and Harvey, N. (2014). Carry on winning: The gamblers’ fallacy creates hot hand effects in online gambling. Cognition, 131(2), 173–180. [Full text and PDF free at the publisher’s site]

Photo credit: https://www.flickr.com/photos/jesscross/3169240519/

June 2, 2014

Preston McAfee joins Microsoft as Chief Economist

Filed in Gossip, Research News

STARTING TODAY


Preston McAfee, former decade-long editor of the American Economic Review, former Caltech professor, and all-around microeconomist extraordinaire, starts today as Chief Economist of Microsoft.

[TechNet] Microsoft hires Preston McAfee as chief economist

We at Decision Science News are excited to be publishing with Preston again, picking up on the research we did with Sid Suri and him back in the Yahoo Research days.

After Yahoo, Sid and I went to Microsoft and Preston went to Google. We’re all very happy to be reunited!

May 28, 2014

If you make hiring or admissions decisions, read this

Filed in Ideas, Research News

MECHANICAL VERSUS CLINICAL DATA COMBINATION IN SELECTION AND ADMISSIONS DECISIONS: A META-ANALYSIS


The pink plastic alligator at Erasmus University Rotterdam says “Interview-based impressions belong in the trash can behind me.”

Is there something you’ve learned in your job that you wish you could tell everyone? We have one: a finding that has been well known to decision-making researchers for decades, yet is all but unknown in the outside world.

Here’s the deal. When hiring or making admissions decisions, impressions of a person formed in an interview are close to worthless. Hire on the most objective data you have. Even when people try to combine their impressions with data, they make worse decisions than they would by following the data alone.

Don’t be swayed by an interview. It’s not fair to the other candidates who are better on paper. They will most likely be better in practice.
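To see why adding impressions can hurt, here is a toy illustration. It is a minimal sketch under assumptions that are ours, not from the papers below: interview impressions are modeled as pure noise (real interviews carry some signal, just far less than people believe), and the holistic judge blends them 50/50 with the objective credential. It needs Python 3.10+ for statistics.correlation.

```python
import random
import statistics  # statistics.correlation requires Python 3.10+

random.seed(0)
N = 10_000

# True performance depends on a measurable credential plus luck.
credential = [random.gauss(0, 1) for _ in range(N)]
performance = [c + random.gauss(0, 1) for c in credential]

# Mechanical rule: predict from the objective credential alone.
mechanical = credential

# "Holistic" rule: blend the credential with an interview impression,
# modeled here (our assumption) as pure noise.
impression = [random.gauss(0, 1) for _ in range(N)]
holistic = [0.5 * c + 0.5 * i for c, i in zip(credential, impression)]

print(f"validity (r), mechanical: {statistics.correlation(mechanical, performance):.2f}")
print(f"validity (r), holistic:   {statistics.correlation(holistic, performance):.2f}")
```

The mechanical rule’s validity comes out around .71 versus roughly .50 for the diluted blend, the same direction as the meta-analytic result in the papers below.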

Please see:

* This paper by Kuncel, Klieger, Connelly, and Ones: Mechanical versus clinical data combination in selection and admissions decisions: A meta-analysis.

ABSTRACT
In employee selection and academic admission decisions, holistic (clinical) data combination methods continue to be relied upon and preferred by practitioners in our field. This meta-analysis examined and compared the relative predictive power of mechanical methods versus holistic methods in predicting multiple work (advancement, supervisory ratings of performance, and training performance) and academic (grade point average) criteria. There was consistent and substantial loss of validity when data were combined holistically—even by experts who are knowledgeable about the jobs and organizations in question—across multiple criteria in work and academic settings. In predicting job performance, the difference between the validity of mechanical and holistic data combination methods translated into an improvement in prediction of more than 50%. Implications for evidence-based practice are discussed.

REFERENCE
Kuncel, N. R., Klieger, D. M., Connelly, B. S., and Ones, D. S. (2013). Mechanical versus clinical data combination in selection and admissions decisions: A meta-analysis. Journal of Applied Psychology, 98(6), 1060–1072.

* This paper by Highhouse: Stubborn reliance on intuition and subjectivity in employee selection.

ABSTRACT
The focus of this article is on implicit beliefs that inhibit adoption of selection decision aids (e.g., paper-and-pencil tests, structured interviews, mechanical combination of predictors). Understanding these beliefs is just as important as understanding organizational constraints to the adoption of selection technologies and may be more useful for informing the design of successful interventions. One of these is the implicit belief that it is theoretically possible to achieve near-perfect precision in predicting performance on the job. That is, people have an inherent resistance to analytical approaches to selection because they fail to view selection as probabilistic and subject to error. Another is the implicit belief that prediction of human behavior is improved through experience. This myth of expertise results in an over-reliance on intuition and a reluctance to undermine one’s own credibility by using a selection decision aid.

REFERENCE
Highhouse, S. (2008). Stubborn reliance on intuition and subjectivity in employee selection. Industrial and Organizational Psychology, 1(3), 333-342.

* This paper by Highhouse and Kostek: Holistic assessment for selection and placement.

ABSTRACT
Holism in assessment is a school of thought or belief system rather than a specific technique. It is based on the notion that assessment of future success requires taking into account the whole person. In its strongest form, individual test scores or measurement ratings are subordinate to expert diagnoses. Traditional standardized tests are seen as providing only limited snapshots of a person, and expert intuition is viewed as the only way to understand how attributes interact to create a complex whole. Expert intuition is used not only to gather information but also to properly execute data combination. Under the holism school, an expert combination of cues qualifies as a method or process of measurement. The holistic assessor views the assessment of personality and ability as an idiographic enterprise, wherein the uniqueness of the individual is emphasized and nomothetic generalizations are downplayed (see Allport, 1962). This belief system has been widely adopted in college admissions and is implicitly held by employers who rely exclusively on traditional employment interviews to make hiring decisions. Milder forms of holistic belief systems are also held by a sizable minority of organizational psychologists—ones who conduct managerial, executive, or special-operation assessments. In this chapter, the roots of holistic assessment for selection and placement decisions are reviewed and the applications of holistic assessment in college admissions and employee selection are discussed. Evidence and controversy surrounding holistic practices are examined, and the assumptions of the holistic school are evaluated. The chapter concludes that the use of more standardized procedures over less standardized ones invariably enhances the scientific integrity of the assessment process.

REFERENCE
Highhouse, Scott and Kostek, John A. (2013). Holistic assessment for selection and placement. Chapter in: APA handbook of testing and assessment in psychology, Vol. 1: Test theory and testing and assessment in industrial and organizational psychology. http://psycnet.apa.org/index.cfm?fa=search.displayRecord&UID=2012-22485-031

Feel free to post other references in the comments.

May 23, 2014

Beware the CRJ

Filed in Ideas

ATTRIBUTION


There is a kind of plane known as a Canadair Regional Jet. CRJ for short.

We have been booked on CRJs 6 times in the past 3 years.

All were short hops.

All were to or from podunk airports.

All were American Eagle flights.

Five of those six flights were cancelled. Mechanical reasons.

We want to give advice.

We don’t know whether to advise avoiding short hops, podunk airports, American Eagle, or the CRJ.

Or some combination.

We want to blame the CRJ.

Beware the CRJ.
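For the record, a run like that is wildly unlikely to be chance. A one-line binomial check, with the base cancellation rate assumed (a couple of percent is in the ballpark for US flights; the true figure varies by carrier, plane, and airport):

```python
from math import comb

# Assumed base rate; actual cancellation rates vary by carrier,
# aircraft type, and airport.
p = 0.02
p_run = sum(comb(6, k) * p**k * (1 - p)**(6 - k) for k in range(5, 7))
print(f"P(5 or more of 6 flights cancelled by chance): {p_run:.1e}")
```

Under any plausible base rate, and assuming independent flights, five cancellations in six is no coincidence, which is exactly why the urge to attribute blame is so strong.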

May 15, 2014

Einhorn award submissions invited for 2014

Filed in Conferences, Research News

THE HILLEL EINHORN NEW INVESTIGATOR AWARD FOR 2014

The Society for Judgment and Decision Making is inviting submissions for the Hillel Einhorn New Investigator Award. The purpose of this award is to encourage outstanding work by new researchers. Individuals are eligible if they have not yet completed their Ph.D. or if they have completed their Ph.D. within the last five years (on or after July 1, 2009). To be considered for the award, please submit a journal-style manuscript on any topic related to judgment and decision making.

In the case of co-authored papers, if the authors are all new investigators they can be considered jointly; otherwise, the new investigator(s) must be the primary author(s) and should be the primary source of ideas. Submissions in dissertation format will not be considered, but articles based on a dissertation are encouraged. Both reprints of published articles and manuscripts that have not yet been published are acceptable.

Submissions will be judged by a committee appointed by the Society. To be considered, submissions must be received by 30 June, 2014. The committee will announce the results to the participants by 14 September, 2014. The award will be announced and presented at the annual meeting of the Society for Judgment and Decision Making. The winner will be invited to give a presentation at that meeting. If the winner cannot obtain full funding from his/her own institution to attend the meeting, an application may be made to the Society for supplemental travel needs.

To make a submission, go to

http://www.sjdm.org/awards/einhorn.upload.html

May 9, 2014

SJDM Conference, Nov 21-24, 2014, Long Beach, CA. Deadline June 30.

Filed in SJDM, SJDM-Conferences

SOCIETY FOR JUDGMENT AND DECISION MAKING ANNUAL CONFERENCE


The Society for Judgment and Decision Making (SJDM) invites abstracts for the 2014 conference (oral presentations, posters, and symposia) and the Einhorn New Investigator Award. The deadline for submissions is June 30, 2014. The conference will be held November 21-24, 2014 in Long Beach, California.

The call for abstracts is available at:
http://www.sjdm.org/programs/2014-cfp.html

May 2, 2014

Do non-compete agreements result in worse work?

Filed in Books, Ideas, Research News

EXPERIMENTAL NON-COMPETES IN ONLINE LABOR MARKETS


The academic power couple On Amir and Orly Lobel report on a clever experiment on non-compete agreements in a recent Harvard Business Review article:

We recruited 1,028 participants to complete an online task for pay. Half of them were asked to do a purely effort-based activity (searching matrices for numbers that added up to 10), and the other half, a creative activity (thinking of words closely associated with other words). Some subjects in each group were placed under restrictions that mimicked a noncompete agreement: They were told that although they would later be invited to perform another paid task, they’d be barred from accepting the same type of task. The remaining subjects were used as a control group and given no restrictions.

Sixty-one percent of the subjects in the noncompete group gave up on their task (thus forgoing payment), compared with only 41% in the control group. Among the subjects who completed the matrix task, people with noncompete conditions were twice as likely to make mistakes as people in the control group. Those who were restricted also skipped more items and spent less time on the task—further indications of low motivation.
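How big is that 20-point gap relative to chance? Here is a quick back-of-the-envelope two-proportion z-test. The group sizes are an assumption on our part: the article reports 1,028 participants but not the exact split between conditions.

```python
from math import sqrt

p1, p2 = 0.61, 0.41   # give-up rates reported in the HBR article
n1 = n2 = 514         # assumed: 1,028 participants split evenly

# Standard pooled two-proportion z-test.
p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
print(f"difference = {p1 - p2:.0%}, z = {z:.1f}")  # z comes out near 6.4
```

Even with the group sizes guessed, a gap this size is many standard errors from zero.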

The finding seems to fit the theme of Orly Lobel’s book Talent Wants to Be Free: Why We Should Learn to Love Leaks, Raids, and Free Riding. When the authors replaced the matrix task with the more enjoyable creative activity, the differences went away. As the authors say, “Prior research had shown that in creative endeavors, people are primarily driven by intrinsic motivations. So it made sense that subjects working on the word associations would be less affected by a negative external incentive than people working on math tasks would be.”

We are interested in manipulations that affect the amount and quality of work done in online labor markets, as well as the honesty of workers. Papers on this topic will be plentiful at the upcoming COBE (Crowdsourcing and Online Behavioral Experiments) conference, so if you are in the Silicon Valley area, stop on by.

April 26, 2014

A rough guide to spotting bad science

Filed in Ideas, Research News

HANDY PDF GIVES TWELVE TIPS


We learned about the Compound Interest blog, a self-described “everyday exploration of chemical compounds,” from this post entitled “A rough guide to spotting bad science.” It’s great and speaks for itself. Visit the link to download it in PDF form or to order a poster.

As our longtime readers know, bad science journalism is a pet peeve of ours. See our related post, “What can we do to defang bad science journalism?”

* REMINDER: The Crowdsourcing and Online Behavioral Experiments submission deadline is Wednesday, April 30!

April 14, 2014

Second Annual Workshop on Crowdsourcing and Online Behavioral Experiments (COBE 2014)

Filed in Conferences

COBE 2014. SUBMISSION DEADLINE APRIL 30, 2014

Save the date for COBE: June 8, 2014

OVERVIEW

The World Wide Web has opened new and unanticipated avenues for conducting large-scale behavioral experiments. Crowdsourcing sites like Amazon Mechanical Turk, oDesk, and Taskcn, among others, have given researchers access to a large participant pool that operates around the clock. As a result, behavioral researchers in academia have turned to crowdsourcing sites in large numbers. Moreover, websites like eBay, Yelp, and Reddit have become places where researchers can conduct field experiments. Companies like Microsoft, Facebook, Google, and Yahoo! conduct hundreds of randomized experiments on a daily basis. We may be rapidly reaching a point where most behavioral experiments will be done online.

The main purpose of this workshop is to bring together researchers conducting behavioral experiments online to share new results, methods and best practices.

BASIC INFORMATION

  • Submission Deadline: April 30, 2014
  • Notification Date: May 13, 2014
  • Workshop Date: June 8, 2014, 4pm – 6pm
  • Location: Stanford University, Palo Alto, California. A workshop before the 15th ACM Conference on Electronic Commerce: http://www.sigecom.org/ec14/

TOPICS OF INTEREST:

Topics of interest for the workshop include but are not limited to:

  • Crowdsourcing
  • Online behavioral experiments
  • Online field experiments
  • Online natural or quasi-experiments
  • Online surveys
  • Human Computation

PAPER SUBMISSION:

Submit papers electronically by visiting https://www.easychair.org/conferences/?conf=cobe2014, logging in or creating an account, and clicking New Submission at the top left.

Submissions are non-archival, meaning contributors are free to publish their results subsequently in archival journals or conferences. There will be no published proceedings. Submissions should be 1-2 pages including references. Accepted papers will be presented as talks of roughly 20 minutes in length.

Organizing Committee:

Program Committee:

  • Yiling Chen, Harvard
  • Lydia Chilton, University of Washington
  • Sam Gosling, University of Texas, Austin
  • John Horton, NYU Stern School of Business
  • Panos Ipeirotis, NYU Stern School of Business
  • Eric Johnson, Columbia Business School
  • Edith Law, Harvard
  • Randall Lewis, Google
  • Andrew Mao, Harvard
  • Gabriele Paolacci, Erasmus University Rotterdam
  • David Reiley, Google
  • Andrew Stephen, University of Pittsburgh, Katz Graduate School of Business
  • Sean Taylor, Facebook

April 7, 2014

13th TIBER Symposium on Psychology and Economics (2014)

Filed in Conferences

KEYNOTES BY RICHARD ZECKHAUSER AND SHANE FREDERICK

tlbr

Job Krijnen, Ilja van Beest, Rik Pieters, Jan Potters, and Marcel Zeelenberg write:

TIBER, the Tilburg Institute for Behavioral Economics Research, is happy to announce the 13th TIBER Symposium on Psychology and Economics, to be held on August 22, 2014 at Tilburg University. The symposium aims to bring together economists, psychologists, marketing researchers, and others who work on behavioral decision making, either in individual or interdependent settings. The symposium consists of two keynotes, a number of parallel sessions with presentations of about 20-30 minutes, and a poster session.

We are proud to have Richard Zeckhauser of Harvard University and Shane Frederick of Yale University as this year’s keynote speakers.

The goal of this series of symposia is to establish contact and discussion among researchers from these different fields. We welcome empirical contributions from diverse areas, such as Individual Decision Making, Consumer Behavior, Bargaining, Social Dilemmas, Experimental Games, Emotions, Fairness and Justice, Rational Choice, and related subjects.

CALL FOR ABSTRACTS
If you would like to contribute to TIBER by presenting your research, we invite you to submit an abstract of max. 250 words via our website www.tilburguniversity.edu/tiber13. On the basis of these abstracts we will select presenters for the symposium.

IMPORTANT DATES
1st of April: Call for abstracts
18th of May: Deadline for submission of abstracts
1st of June: Selection of speakers
22nd of August: Symposium at Tilburg University

More information about the symposium program and keynote speakers, as well as the location and registration forms, will soon be available at

http://www.tilburguniversity.edu/research/institutes-and-research-groups/tiber/conferences/

If you have any questions regarding the symposium, please contact Job Krijnen
(j.m.t.krijnen@tilburguniversity.edu). Please use the subject line ‘TIBER 13’.