Ebook: Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments: Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22-27, 2015, Revised Contributions

Edited by Daniel Archambault, Helen C. Purchase, and Tobias Hoßfeld
  • Format: PDF+DRM
  • Price: 55,56 €*
  • * the price is final, i.e. no further discounts apply
  • This ebook is intended for personal use only. Ebooks cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has issued this ebook in encrypted form, which means that special software is required to read it. You also need to create an Adobe ID (more information here). The ebook can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), you need to install the free PocketBook Reader app (iOS / Android).

    To read on a PC or Mac, you need to install Adobe Digital Editions (a free application designed specifically for reading ebooks; it should not be confused with Adobe Reader, which is probably already installed on your computer).

    This ebook cannot be read on an Amazon Kindle.

As the outcome of Dagstuhl Seminar 15481 on Crowdsourcing and Human-Centered Experiments, this book is a primer for computer science researchers who intend to use crowdsourcing technology for human-centered experiments.

The focus of this Dagstuhl seminar, held at Dagstuhl Castle in November 2015, was to discuss experiences and methodological considerations when using crowdsourcing platforms to run human-centered experiments that test the effectiveness of visual representations. The inspiring Dagstuhl atmosphere fostered discussions and brought together researchers from different research directions. The papers provide information on crowdsourcing technology and experimental methodologies; comparisons between crowdsourcing and lab experiments; the use of crowdsourcing for empirical studies in visualisation, psychology, Quality of Experience (QoE), and HCI; and finally the nature of crowdworkers and their work, their motivation and demographic background, as well as the relationships among the people who form the crowdsourcing community.

Table of Contents

Evaluation in the Crowd: An Introduction (pp. 1-5)
Daniel Archambault, Helen C. Purchase, Tobias Hoßfeld

Crowdsourcing Versus the Laboratory: Towards Human-Centered Experiments Using the Crowd (pp. 6-26)
Ujwal Gadiraju, Sebastian Möller, Martin Nöllenburg, Dietmar Saupe, Sebastian Egger-Lampl, Daniel Archambault, Brian Fisher

Understanding the Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing (pp. 27-69)
David Martin, Sheelagh Carpendale, Neha Gupta, Tobias Hoßfeld, Babak Naderi, Judith Redi, Ernestasia Siahaan, Ina Wechsung

Crowdsourcing Technology to Support Academic Research (pp. 70-95)
Matthias Hirth, Jason Jacques, Peter Rodgers, Ognjen Scekic, Michael Wybrow

Crowdsourcing for Information Visualization: Promises and Pitfalls (pp. 96-138)
Rita Borgo, Bongshin Lee, Benjamin Bach, Sara Fabrikant, Radu Jianu, Andreas Kerren, Stephen Kobourov, Fintan McGee, Luana Micallef, Tatiana von Landesberger, Katrin Ballweg, Stephan Diehl, Paolo Simonetto, Michelle Zhou

Cognitive Information Theories of Psychology and Applications with Visualization and HCI Through Crowdsourcing Platforms (pp. 139-153)
Darren J. Edwards, Linda T. Kaastra, Brian Fisher, Remco Chang, Min Chen

Crowdsourcing Quality of Experience Experiments (pp. 154-190)
Sebastian Egger-Lampl, Judith Redi, Tobias Hoßfeld, Matthias Hirth, Sebastian Möller, Babak Naderi, Christian Keimel, Dietmar Saupe

Author Index (p. 191)