Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments: Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22-27, 2015, Revised Contributions, 1st ed. 2017 [Paperback]

Edited by Daniel Archambault, Helen C. Purchase, and Tobias Hoßfeld
  • Format: Paperback / softback, 191 pages, height x width: 235x155 mm, weight: 3168 g, 15 Illustrations, black and white; VII, 191 p. 15 illus., 1 Paperback / softback
  • Series: Lecture Notes in Computer Science 10264
  • Publication date: 29-Sep-2017
  • Publisher: Springer International Publishing AG
  • ISBN-10: 3319664344
  • ISBN-13: 9783319664347
  • Paperback
  • Price: 48,70 €*
  • * the price is final, i.e. no further discounts apply
  • Regular price: 57,29 €
  • You save 15%
  • Delivery from the publisher takes approximately 2-4 weeks
As the outcome of the Dagstuhl Seminar 15481 on Crowdsourcing and Human-Centered Experiments, this book is a primer for computer science researchers who intend to use crowdsourcing technology for human-centered experiments.

The focus of this Dagstuhl seminar, held in Dagstuhl Castle in November 2015, was to discuss experiences and methodological considerations when using crowdsourcing platforms to run human-centered experiments that test the effectiveness of visual representations. The inspiring Dagstuhl atmosphere fostered discussions and brought together researchers from different research directions. The papers provide information on crowdsourcing technology and experimental methodologies; comparisons between crowdsourcing and lab experiments; the use of crowdsourcing for empirical studies in visualization, psychology, quality of experience (QoE), and HCI; and finally the nature of crowdworkers and their work, their motivation and demographic background, as well as the relationships among people forming the crowdsourcing community.

Evaluation in the Crowd: An Introduction
1(5)
Daniel Archambault
Helen C. Purchase
Tobias Hoßfeld
Crowdsourcing Versus the Laboratory: Towards Human-Centered Experiments Using the Crowd
6(21)
Ujwal Gadiraju
Sebastian Möller
Martin Nöllenburg
Dietmar Saupe
Sebastian Egger-Lampl
Daniel Archambault
Brian Fisher
Understanding the Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing
27(43)
David Martin
Sheelagh Carpendale
Neha Gupta
Tobias Hoßfeld
Babak Naderi
Judith Redi
Ernestasia Siahaan
Ina Wechsung
Crowdsourcing Technology to Support Academic Research
70(26)
Matthias Hirth
Jason Jacques
Peter Rodgers
Ognjen Šćekić
Michael Wybrow
Crowdsourcing for Information Visualization: Promises and Pitfalls
96(43)
Rita Borgo
Bongshin Lee
Benjamin Bach
Sara Fabrikant
Radu Jianu
Andreas Kerren
Stephen Kobourov
Fintan McGee
Luana Micallef
Tatiana von Landesberger
Katrin Ballweg
Stephan Diehl
Paolo Simonetto
Michelle Zhou
Cognitive Information Theories of Psychology and Applications with Visualization and HCI Through Crowdsourcing Platforms
139(15)
Darren J. Edwards
Linda T. Kaastra
Brian Fisher
Remco Chang
Min Chen
Crowdsourcing Quality of Experience Experiments
154(37)
Sebastian Egger-Lampl
Judith Redi
Tobias Hoßfeld
Matthias Hirth
Sebastian Möller
Babak Naderi
Christian Keimel
Dietmar Saupe
Author Index 191