Gazepoint Citations
We are frequently asked for examples of research papers in which Gazepoint eye-tracking technology is used, so we are happy to provide this short list of publications we have found to date. If you are interested in using our eye-tracking software in your research and don't have the software yet, shop now or contact us to get started!
If you have published research from a neuromarketing or other study that uses the Gazepoint system, please let us know and we will add a link to your work here! Our suggested reference for citing Gazepoint in your research is:
Gazepoint (2021). GP3 Eye-Tracker. Retrieved from https://www.gazept.com
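If your reference manager works from BibTeX, a minimal sketch of that same suggested reference could look like the entry below (the entry key and field choices are ours, not an official Gazepoint format):

@misc{gazepoint2021gp3,
  author       = {{Gazepoint}},
  title        = {GP3 Eye-Tracker},
  year         = {2021},
  howpublished = {https://www.gazept.com}
}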
Dondi, P., Sapuppo, S., & Porta, M. (2024). Leyenes: A gaze-based text entry method using linear smooth pursuit and target speed. International Journal of Human-Computer Studies, 184, 103204. https://doi.org/10.1016/j.ijhcs.2023.103204

Yin, R., & Neyens, D. M. (2024). Examining how information presentation methods and a chatbot impact the use and effectiveness of electronic health record patient portals: An exploratory study. Patient Education and Counseling, 119, 108055. https://doi.org/10.1016/j.pec.2023.108055

Cieśla, M., & Dzieńkowski, M. (2023). An Analysis of the Implementation of Accessibility Tools on Websites. Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska, 13(4), 51–56. https://doi.org/10.35784/iapgos.4459

Khairunnisa, G., & Sari, H. (2023). Eye Tracking-based Analysis of Customer Interest on The Effectiveness of Eco-friendly Product Advertising Content. Jurnal Optimasi Sistem Industri, 22, 153–164. https://doi.org/10.25077/josi.v22.n2.p153-164.2023

Chvátal, R., Slezáková, J., & Popelka, S. (2023). Analysis of problem-solving strategies for the development of geometric imagination using eye-tracking. Education and Information Technologies. https://doi.org/10.1007/s10639-023-12395-z

Zhang, C., Tian, C., Han, T., Li, H., Feng, Y., Chen, Y., Proctor, R. W., & Zhang, J. (2023). Evaluation of Infrastructure-based Warning System on Driving Behaviors – A Roundabout Study.

Sun, L., Zhang, M., Qiu, Y., & Zhang, C. (2023). Effects of Sleep Deprivation and Hazard Types on the Visual Search Patterns and Hazard Response Times of Taxi Drivers. Behavioral Sciences, 13(12), 1005. https://doi.org/10.3390/bs13121005

Mok, S., Park, S., & Whang, M. (2023). Examining the Impact of Digital Human Gaze Expressions on Engagement Induction. Biomimetics, 8(8), 610. https://doi.org/10.3390/biomimetics8080610

Chang, Y.-C., Gandi, N., Shin, K., Mun, Y.-J., Driggs-Campbell, K., & Kim, J. (2023). Specifying Target Objects in Robot Teleoperation Using Speech and Natural Eye Gaze. 2023 IEEE-RAS 22nd International Conference on Humanoid Robots (Humanoids), 1–7. https://doi.org/10.1109/Humanoids57100.2023.10375186

Kowalewski, S. J., & Williamson, B. (2023). Fostering Advocacy, Developing Empathetic UX Bricoleurs: Ongoing Programmatic Assessment and Responsive Curriculum Design. IEEE Transactions on Professional Communication, 66(4), 382–396. https://doi.org/10.1109/TPC.2023.3320530

Dang, A., & Nichols, B. S. (2023). The effects of size referents in user-generated photos on online review helpfulness. Journal of Consumer Behaviour. https://doi.org/10.1002/cb.2281

Kusumo, A. H. (2023). Has Website Design using Website Builder Fulfilled Usability Aspects? A Study Case of Three Website Builders. 4th International Conference on Informatics, Technology and Engineering 2023 (InCITE 2023), 545–557. https://doi.org/10.2991/978-94-6463-288-0_45

Lee, S., Byun, G., & Ha, M. (2023). Exploring the association between environmental factors and fear of crime in residential streets: An eye-tracking and questionnaire study. Journal of Asian Architecture and Building Engineering, 1–18. https://doi.org/10.1080/13467581.2023.2278449

Cybulski, P., Medyńska-Gulij, B., & Horbiński, T. (2023). Users' Visual Experience During Temporal Navigation in Forecast Weather Maps on Mobile Devices. Journal of Geovisualization and Spatial Analysis, 7(2), 32. https://doi.org/10.1007/s41651-023-00160-2

Cui, Y., Liu, X., & Cheng, Y. (2023). Reader perception of and attitude to English-Chinese advertising posters: An eye tracking study. SN Social Sciences, 3(11), 192. https://doi.org/10.1007/s43545-023-00782-9

S Kumar, D., Sahadev, S., & Purani, K. (2023). Visual Aesthetic Quotient: Establishing the Effects of Computational Aesthetic Measures for Servicescape Design. Journal of Service Research, 10946705231205000. https://doi.org/10.1177/10946705231205000

Cheng, G., Zou, D., Xie, H., & Lee Wang, F. (2023). Exploring differences in self-regulated learning strategy use between high- and low-performing students in introductory programming: An analysis of eye-tracking and retrospective think-aloud data from program comprehension. Computers & Education, 104948. https://doi.org/10.1016/j.compedu.2023.104948
%20movements.%20They%20could%20also%20examine%20the%20effects%20of%20SRL%20strategies%20training%20on%20students%27%20motivation%2C%20engagement%2C%20and%20performance%20in%20various%20types%20of%20programming%20activities.%22%2C%22date%22%3A%222023-10-16%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.compedu.2023.104948%22%2C%22ISSN%22%3A%220360-1315%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.sciencedirect.com%5C%2Fscience%5C%2Farticle%5C%2Fpii%5C%2FS0360131523002257%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-10-19T17%3A45%3A20Z%22%7D%7D%2C%7B%22key%22%3A%22UE65VC9Q%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Segedinac%20et%20al.%22%2C%22parsedDate%22%3A%222023-10-11%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3ESegedinac%2C%20M.%2C%20Savi%26%23x107%3B%2C%20G.%2C%20Zeljkovi%26%23x107%3B%2C%20I.%2C%20Slivka%2C%20J.%2C%20%26amp%3B%20Konjovi%26%23x107%3B%2C%20Z.%20%282023%29.%20Assessing%20code%20readability%20in%20Python%20programming%20courses%20using%20eye-tracking.%20%3Ci%3EComputer%20Applications%20in%20Engineering%20Education%3C%5C%2Fi%3E%2C%20%3Ci%3En%5C%2Fa%3C%5C%2Fi%3E%28n%5C%2Fa%29.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1002%5C%2Fcae.22685%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1002%5C%2Fcae.22685%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Assessing%20code%20readability%20in%20Python%20programming%20courses%20using%20eye-tracking%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Milan%22%2C%22lastName%22%3A%22Segedinac%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Goran%22%2C%22lastName%22%3A%22Savi%5Cu0107%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ivana%22%2C%22lastName%22%3A%22Zeljkovi%5Cu0107%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jelena%22%2C%22lastName%22%3A%22Slivka%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Zora%22%2C%22lastName%22%3A%22Konjovi%5Cu0107%22%7D%5D%2C%22abstractNote%22%3A%22Code%20readability%20models%20are%20typically%20based%20on%20the%20code%27s%20structural%20and%20textual%20features%2C%20considering%20code%20readability%20as%20an%20objective%20category.%20However%2C%20readability%20is%20inherently%20subjective%20and%20dependent%20on%20the%20knowledge%20and%20experience%20of%20the%20reader%20analyzing%20the%20code.%20This%20paper%20assesses%20the%20readability%20of%20Python%20code%20statements%20commonly%20used%20in%20undergraduate%20programming%20courses.%20Our%20readability%20model%20is%20based%20on%20tracking%20the%20reader%27s%20eye%20movement%20during%20the%20while-read%20phase.%20It%20uses%20machine%20learning%20%28ML%29%20techniques%20and%20relies%20on%20a%20novel%20set%20of%20features%5Cu2014observational%20features%5Cu2014that%20capture%20how%20the%20readers%20read%20the%20code.%20We%20experimented%20by%20tracking%20the%20eye%20movement%20of%2090%20undergraduate%20students%20while%20assessing%20the%20readability%20of%2048%20Python%20code%20snippets.%20We%20trained%20an%20ML%20model%20that%20predicts%20readability%20based%20on%20the%20collected%20observational%20data%20and%20the%20code%20snippet%27s%20structural%20and%20textual%20features.%20In%20our%20exp
eriments%2C%20the%20XGBoost%20classifier%20trained%20using%20observational%20features%20exclusively%20achieved%20the%20best%20results%20%280.85%20F-measure%29.%20Using%20correlation%20analysis%2C%20we%20identified%20Python%20statements%20most%20affecting%20readability%20for%20undergraduate%20students%20and%20proposed%20implications%20for%20teaching%20Python%20programming.%20In%20line%20with%20findings%20for%20Java%20language%2C%20we%20found%20that%20constructs%20related%20to%20the%20code%27s%20size%20and%20complexity%20hurt%20the%20code%27s%20readability.%20Numerous%20comments%20also%20hindered%20readability%2C%20potentially%20due%20to%20their%20association%20with%20less%20readable%20code.%20Some%20Python-specific%20statements%20%28list%20comprehension%2C%20lambda%20function%2C%20and%20dictionary%20comprehension%29%20harmed%20code%20readability%2C%20even%20though%20they%20were%20part%20of%20the%20curriculum.%20Tracking%20students%27%20gaze%20indicated%20some%20additional%20factors%2C%20most%20notably%20nonlinearity%20introduced%20by%20if%2C%20for%2C%20while%2C%20try%2C%20and%20function%20call%20statements.%22%2C%22date%22%3A%2211%20October%202023%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1002%5C%2Fcae.22685%22%2C%22ISSN%22%3A%221099-0542%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fonlinelibrary.wiley.com%5C%2Fdoi%5C%2Fabs%5C%2F10.1002%5C%2Fcae.22685%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-10-20T18%3A22%3A44Z%22%7D%7D%2C%7B%22key%22%3A%22NFJXGICC%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Novia%20et%20al.%22%2C%22parsedDate%22%3A%222023-10-09%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3ENovia%2C%20R.%2C%20Titis%2C%20W.%2C%20%26amp%3B%20Mirwan%2C%20U.%20%282023%29.%20An%20eye%20tracking%20study%20of%20customers%26%23x2019%3B%20visual%20attention%20to%20the%20fast-food%20chain%26%23x2019%3Bs%20page%20on%20instagram.%20%3Ci%3EAIP%20Conference%20Proceedings%3C%5C%2Fi%3E%2C%20%3Ci%3E2510%3C%5C%2Fi%3E%281%29%2C%20030042.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1063%5C%2F5.0129351%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1063%5C%2F5.0129351%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22An%20eye%20tracking%20study%20of%20customers%5Cu2019%20visual%20attention%20to%20the%20fast-food%20chain%5Cu2019s%20page%20on%20instagram%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22R.%22%2C%22lastName%22%3A%22Novia%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22W.%22%2C%22lastName%22%3A%22Titis%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22U.%22%2C%22lastName%22%3A%22Mirwan%22%7D%5D%2C%22abstractNote%22%3A%22This%20study%20investigates%20consumer%20visual%20attention%20to%20the%20fast%20food%20chain%5Cu2019s%20page%20on%20Instagram%20and%20its%20impact%20on%20purchase%20intentions.%20The%20study%20used%20an%20eye%20tracking%20device%20to%20examine%20the%20sequences%20of%2011%20participants%20%28mean%20of%20age%2024%20years%29%20when%20tracking%20information%20on%20the%20Instagram%20page%20of%20fast-food%20chain.%20The%20study%20focused%20on%20the%20main%20page%20and%20post%20page%20of%20a%20global%20and%20local%20brand.%20The%20data%20were%20analyzed%20descriptively%2C%20parametrica
lly%2C%20and%20non-parametrically.%20The%20results%20showed%20that%20the%20sequences%20of%20participants%20did%20not%20differ%20in%20both%20types%20of%20brands.%20However%2C%20the%20results%20of%20the%20analysis%20showed%20that%20the%20duration%20of%20fixation%20differed%20significantly%20on%20the%20AOI%20on%20the%20main%20page%28F%281.209%2C12.092%29%3D59.006%2C%20p%26lt%3B.001%29%20and%20on%20the%20post%20page%28X2%282%29%3D30.240%2C%20p%26lt%3B.001%29.%20Multiple%20linear%20regression%20tests%20were%20also%20used%20in%20this%20study%20to%20look%20at%20the%20contribution%20value%20of%20each%20AOI%20to%20buying%20interest.%20The%20results%20showed%20that%20the%20contribution%20value%20of%20AOI%20was%20stated%20as%20much%20as%2031.2%25%20variance%20in%20consumer%20buying%20interest%2C%20with%20regression%20equation%3B%20Buy%20interest%20%28Y%29%20%3D%20%28-0.135%2AAOI1%29%20%2B%20%28-0.272%2AAOI2%29%20%2B%20%28-0.161%2AAOI3%29%2B%20%28-0.60%2AAOI4%29%2B%28-0.128%2AAOI5%29%2B%20%28-0.213%2AAOI6%29plus%3B%20%280.063%2AAOI7%29%2B5.00.%22%2C%22date%22%3A%222023-10-09%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1063%5C%2F5.0129351%22%2C%22ISSN%22%3A%220094-243X%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1063%5C%2F5.0129351%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-10-20T18%3A23%3A17Z%22%7D%7D%2C%7B%22key%22%3A%22R7JLD7KU%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Hwang%20and%20Lee%22%2C%22parsedDate%22%3A%222023-10-08%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EHwang%2C%20E.%2C%20%26amp%3B%20Lee%2C%20J.%20%282023%29.%20Attention-based%20automatic%20editing%20of%20virtual%20lectures%20for%20reduced%20production%20labor%20and%20effective%20learning%20experience.%20%3Ci%3EInternational%20Journal%20of%20Human-Computer%20Studies%3C%5C%2Fi%3E%2C%20103161.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.ijhcs.2023.103161%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.ijhcs.2023.103161%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Attention-based%20automatic%20editing%20of%20virtual%20lectures%20for%20reduced%20production%20labor%20and%20effective%20learning%20experience%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eugene%22%2C%22lastName%22%3A%22Hwang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jeongmi%22%2C%22lastName%22%3A%22Lee%22%7D%5D%2C%22abstractNote%22%3A%22Recently%20there%20has%20been%20a%20surge%20in%20demand%20for%20online%20video-based%20learning%2C%20and%20the%20importance%20of%20high-quality%20educational%20videos%20is%20ever-growing.%20However%2C%20a%20uniform%20format%20of%20videos%20that%20neglects%20individual%20differences%20and%20the%20labor-intensive%20process%20of%20editing%20are%20major%20setbacks%20in%20producing%20effective%20educational%20videos.%20This%20study%20aims%20to%20resolve%20the%20issues%20by%20proposing%20an%20automatic%20lecture%20video%20editing%20pipeline%20based%20on%20each%20individual%5Cu2019s%20attention%20pattern.%20In%20this%20pipeline%2C%20the%20eye-tracking%20data%20are%20obtained%20while%20each%20individual%20watches%20virtual%20lectures%2C%20which%20later%20go%20through%20multiple%20filters%20to%20define%20the%20vie
wer%5Cu2019s%20locus%20of%20attention%20and%20to%20select%20the%20appropriate%20shot%20at%20each%20time%20point%20to%20create%20personalized%20videos.%20To%20assess%20the%20effectiveness%20of%20the%20proposed%20method%2C%20video%20characteristics%2C%20subjective%20evaluations%20of%20the%20learning%20experience%2C%20and%20objective%20eye-movement%20features%20were%20compared%20between%20differently%20edited%20videos%20%28attention-based%2C%20randomly%20edited%2C%20professionally%20edited%29.%20The%20results%20showed%20that%20our%20method%20dramatically%20reduced%20the%20editing%20time%2C%20with%20similar%20video%20characteristics%20to%20those%20of%20professionally%20edited%20versions.%20Attention-based%20versions%20were%20also%20evaluated%20to%20be%20significantly%20better%20than%20randomly%20edited%20ones%2C%20and%20as%20effective%20as%20professionally%20edited%20ones.%20Eye-tracking%20results%20indicated%20that%20attention-based%20videos%20have%20the%20potential%20to%20decrease%20the%20cognitive%20load%20of%20learners.%20These%20results%20suggest%20that%20attention-based%20automatic%20editing%20can%20be%20a%20viable%20or%20even%20a%20better%20alternative%20to%20the%20human%20expert-dependent%20approach%2C%20and%20individually-tailored%20videos%20have%20the%20potential%20to%20heighten%20the%20learning%20experience%20and%20effect.%22%2C%22date%22%3A%222023-10-08%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.ijhcs.2023.103161%22%2C%22ISSN%22%3A%221071-5819%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.sciencedirect.com%5C%2Fscience%5C%2Farticle%5C%2Fpii%5C%2FS1071581923001702%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-10-20T18%3A24%3A06Z%22%7D%7D%2C%7B%22key%22%3A%22WYRDL5EU%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Biondi%20et%20al.%22%2C%22parsedDate%22%3A%222023-10-05%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EBiondi%2C%20F.%20N.%2C%20Graf%2C%20F.%2C%20Pillai%2C%20P.%2C%20%26amp%3B%20Balasingam%2C%20B.%20%282023%29.%20On%20validating%20a%20generic%20camera-based%20blink%20detection%20system%20for%20cognitive%20load%20assessment.%20%3Ci%3ECognitive%20Computation%20and%20Systems%3C%5C%2Fi%3E%2C%20%3Ci%3En%5C%2Fa%3C%5C%2Fi%3E%28n%5C%2Fa%29.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1049%5C%2Fccs2.12088%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1049%5C%2Fccs2.12088%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22On%20validating%20a%20generic%20camera-based%20blink%20detection%20system%20for%20cognitive%20load%20assessment%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Francesco%20N.%22%2C%22lastName%22%3A%22Biondi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Frida%22%2C%22lastName%22%3A%22Graf%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Prarthana%22%2C%22lastName%22%3A%22Pillai%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Balakumar%22%2C%22lastName%22%3A%22Balasingam%22%7D%5D%2C%22abstractNote%22%3A%22Detecting%20the%20human%20operator%27s%20cognitive%20state%20is%20paramount%20in%20settings%20wherein%20maintaining%20optimal%20workload%20is%20necessary%20for%20task%20performance.%20Blink%20rate%20is%20an%20established%20metric%20of%20
cognitive%20load%2C%20with%20a%20higher%20blink%20frequency%20being%20observed%20under%20conditions%20of%20greater%20workload.%20Measuring%20blink%20rate%20requires%20the%20use%20of%20eye-trackers%20which%20limits%20the%20adoption%20of%20this%20metric%20in%20the%20real-world.%20The%20authors%20aim%20to%20investigate%20the%20effectiveness%20of%20using%20a%20generic%20camera-based%20system%20as%20a%20way%20to%20assess%20the%20user%27s%20cognitive%20load%20during%20a%20computer%20task.%20Participants%20completed%20a%20mental%20task%20while%20sitting%20in%20front%20of%20a%20computer.%20Blink%20rate%20was%20recorded%20via%20both%20the%20generic%20camera-based%20system%20and%20a%20scientific-grade%20eye-tracker%20for%20validation%20purposes.%20Cognitive%20load%20was%20also%20assessed%20through%20the%20performance%20in%20a%20single%20stimulus%20detection%20task.%20The%20blink%20rate%20recorded%20via%20the%20generic%20camera-based%20approach%20did%20not%20differ%20from%20the%20one%20obtained%20through%20the%20eye-tracker.%20No%20meaningful%20changes%20in%20blink%20rate%20were%20however%20observed%20with%20increasing%20cognitive%20load.%20Results%20show%20the%20generic-camera%20based%20system%20may%20represent%20a%20more%20affordable%2C%20ubiquitous%20means%20for%20assessing%20cognitive%20workload%20during%20computer%20task.%20Future%20work%20should%20further%20investigate%20ways%20to%20increase%20its%20accuracy%20during%20the%20completion%20of%20more%20realistic%20tasks.%22%2C%22date%22%3A%22October%205%202023%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1049%5C%2Fccs2.12088%22%2C%22ISSN%22%3A%222517-7567%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fonlinelibrary.wiley.com%5C%2Fdoi%5C%2Fabs%5C%2F10.1049%5C%2Fccs2.12088%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-10-10T20%3A34%3A54Z%22%7D%7D%2C%7B%22key%22%3A%222QX8A682%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Contemori%20et%20al.%22%2C%22parsedDate%22%3A%222023-10-05%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EContemori%2C%20G.%2C%20Oletto%2C%20C.%20M.%2C%20Battaglini%2C%20L.%2C%20Motterle%2C%20E.%2C%20%26amp%3B%20Bertamini%2C%20M.%20%282023%29.%20Foveal%20feedback%20in%20perceptual%20processing%3A%20Contamination%20of%20neural%20representations%20and%20task%20difficulty%20effects.%20%3Ci%3EPLOS%20ONE%3C%5C%2Fi%3E%2C%20%3Ci%3E18%3C%5C%2Fi%3E%2810%29%2C%20e0291275.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1371%5C%2Fjournal.pone.0291275%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1371%5C%2Fjournal.pone.0291275%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Foveal%20feedback%20in%20perceptual%20processing%3A%20Contamination%20of%20neural%20representations%20and%20task%20difficulty%20effects%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Giulio%22%2C%22lastName%22%3A%22Contemori%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Carolina%20Maria%22%2C%22lastName%22%3A%22Oletto%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Luca%22%2C%22lastName%22%3A%22Battaglini%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Elena%22%2C%22lastName%22%3A%22Motterle%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ma
rco%22%2C%22lastName%22%3A%22Bertamini%22%7D%5D%2C%22abstractNote%22%3A%22Visual%20object%20recognition%20was%20traditionally%20believed%20to%20rely%20on%20a%20hierarchical%20feedforward%20process.%20However%2C%20recent%20evidence%20challenges%20this%20notion%20by%20demonstrating%20the%20crucial%20role%20of%20foveal%20retinotopic%20cortex%20and%20feedback%20signals%20from%20higher-level%20visual%20areas%20in%20processing%20peripheral%20visual%20information.%20The%20nature%20of%20the%20information%20conveyed%20through%20foveal%20feedback%20remains%20a%20topic%20of%20debate.%20To%20address%20this%2C%20we%20conducted%20a%20study%20employing%20a%20foveal%20mask%20paradigm%20with%20varying%20stimulus-mask%20onset%20asynchronies%20in%20a%20peripheral%20same%5C%2Fdifferent%20task%2C%20where%20peripheral%20objects%20exhibited%20different%20degrees%20of%20similarity.%20Our%20hypothesis%20posited%20that%20simultaneous%20arrival%20of%20feedback%20and%20mask%20information%20in%20the%20foveal%20cortex%20would%20lead%20to%20neural%20contamination%2C%20biasing%20perception.%20Notably%2C%20when%20the%20two%20peripheral%20objects%20were%20identical%2C%20we%20observed%20a%20significant%20increase%20in%20the%20number%20of%20%5C%22different%5C%22%20responses%2C%20peaking%20at%20approximately%20100%20ms.%20Similar%20effect%20was%20found%20when%20the%20objects%20were%20dissimilar%2C%20but%20with%20an%20overall%20later%20timing%20%28around%20150%20ms%29.%20No%20significant%20difference%20was%20found%20when%20comparing%20easy%20%28dissimilar%20objects%29%20and%20difficult%20trials%20%28similar%20objects%29.%20The%20findings%20challenge%20the%20hypothesis%20that%20foveation%20planning%20alone%20accounts%20for%20the%20observed%20effects.%20Instead%2C%20these%20and%20previous%20observations%20support%20the%20notion%20that%20the%20foveal%20cortex%20serves%20as%20a%20visual%20sketchpad%20for%20maintaining%20and%20manipulating%20task-relevant%20information.%22%2C%22date%22%3A%22Oct%205%2C%202023%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1371%5C%2Fjournal.pone.0291275%22%2C%22ISSN%22%3A%221932-6203%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fjournals.plos.org%5C%2Fplosone%5C%2Farticle%3Fid%3D10.1371%5C%2Fjournal.pone.0291275%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-10-10T20%3A33%3A57Z%22%7D%7D%2C%7B%22key%22%3A%22SXW84BDZ%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Kamal%20et%20al.%22%2C%22parsedDate%22%3A%222023-10-04%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EKamal%2C%20M.%2C%20M%26%23xF6%3Bbius%2C%20M.%2C%20Bartella%2C%20A.%20K.%2C%20%26amp%3B%20Lethaus%2C%20B.%20%282023%29.%20Perception%20of%20aesthetic%20features%20after%20surgical%20treatment%20of%20craniofacial%20malformations%20by%20observers%20of%20the%20same%20age%3A%20An%20eye-tracking%20study.%20%3Ci%3EJournal%20of%20Cranio-Maxillofacial%20Surgery%3C%5C%2Fi%3E.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.jcms.2023.09.009%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.jcms.2023.09.009%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Perception%20of%20aesthetic%20features%20after%20surgical%20treatment%20of%20craniofacial%20malformations%20by%20observers%20of%20the%20same%20age%3A%20An%20eye-trac
king%20study%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mohammad%22%2C%22lastName%22%3A%22Kamal%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marianne%22%2C%22lastName%22%3A%22M%5Cu00f6bius%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alexander%20K.%22%2C%22lastName%22%3A%22Bartella%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bernd%22%2C%22lastName%22%3A%22Lethaus%22%7D%5D%2C%22abstractNote%22%3A%22The%20aim%20of%20this%20study%20is%20to%20evaluate%20where%20exactly%20children%20and%20adolescents%20of%20the%20same%20group%20look%20when%20they%20interact%20with%20each%20other%2C%20and%20attempt%20to%20record%20and%20analyse%20the%20data%20recorded%20by%20eye-tracking%20technology.%5CnMaterials%20and%20methods%5Cn60%20subjects%20participated%20in%20the%20study%2C%20evenly%20divided%20into%20three%20age%20categories%20of%2020%20each%20in%20pre-school%5C%2Fprimary%20school%20age%20%285%5Cu20139%20years%29%2C%20early%20adolescence%20%2810%5Cu201314%20years%29%20and%20late%20adolescence%5C%2Ftransition%20to%20adulthood%20%2815%5Cu201319%20years%29.%20Age%20groups%20were%20matched%20and%20categorized%20to%20be%20used%20both%20for%20creating%20the%20picture%20series%20and%20testing.%20Photographs%20of%20patients%20with%20both%20unilateral%20and%20bilateral%20cleft%20lip%20and%20palate%20were%20used%20to%20create%20the%20series%20of%20images%20which%20consisted%20of%20a%20total%20of%2015%20photos%2C%205%20of%20which%20were%20photos%20of%20patients%20with%20surgically%20treated%20cleft%20deformity%20and%2010%20control%20photos%20with%20healthy%20faces%2C%20that%20were%20presented%20in%20random%20order.%20Using%20the%20eye-tracking%20module%2C%20the%20data%20on%20%5C%22area%20of%20first%20view%5C%22%20%28area%20of%20initial%20attention%29%2C%20%5C%22area%20with%20longest%20view%5C%22%20%28area%20of%20sustained%20attention%29%2C%20%5C%22time%20until%20view%20in%20this%20area%5C%22%20%28time%20of%20initial%20attention%29%20and%20%5C%22frequency%20of%20view%20in%20each%20area%5C%22%20%28time%20of%20sustained%20attention%29%20were%20calculated.%5CnResults%5CnAcross%20all%20groups%2C%20there%20was%20no%20significant%20difference%20for%20the%20individual%20regions%20for%20the%20parameters%20of%20initial%20attention%20%28area%20of%20first%20view%29%2C%20while%20the%20time%20until%20first%20fixation%20of%20one%20of%20the%20AOIs%20%28time%20until%20view%20in%20this%20area%29%20was%20significant%20for%20all%20facial%20regions.%20A%20predictable%20path%20of%20the%20facial%20scan%20is%20abandoned%20when%20secondary%20facial%20deformities%20are%20present%20and%20attention%20is%20focused%20more%20on%20the%20region%20of%20an%20existing%20deformity%2C%20which%20are%20the%20nose%20and%20mouth%20regions.%5CnConclusions%5CnThere%20are%20significant%20differences%20in%20both%20male%20and%20female%20participants%27%20viewing%20of%20faces%20with%20and%20without%20secondary%20cleft%20deformity.%20While%20in%20the%20age%20group%20of%20the%20younger%20test%20persons%20it%20was%20still%20the%20mouth%20region%20that%20received%20special%20attention%20from%20the%20male%20viewers%2C%20this%20shifted%20in%20the%20male%20test%20persons%20of%20the%20middle%20age%20group%20to%20the%20nose%20region%2C%20which%20was%20fixed%20significantly%20more%20often%20and%20faster.%20In%20the%20female%20participants%2C%20the%20mouth%20and%20nose%20regions%20were%20each%20looked%20at%20for%20twice%20as%20long%20compared%20to%20the%20healthy%20
faces%2C%20making%20both%20the%20mouth%20and%20the%20nose%20region%20are%20in%20the%20focus%20of%20observation.%22%2C%22date%22%3A%222023-10-04%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.jcms.2023.09.009%22%2C%22ISSN%22%3A%221010-5182%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.sciencedirect.com%5C%2Fscience%5C%2Farticle%5C%2Fpii%5C%2FS101051822300183X%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-10-10T20%3A29%3A58Z%22%7D%7D%2C%7B%22key%22%3A%227HNL8HJF%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Inoue%20et%20al.%22%2C%22parsedDate%22%3A%222023-10%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EInoue%2C%20M.%2C%20Nishiyama%2C%20M.%2C%20%26amp%3B%20Iwai%2C%20Y.%20%282023%29.%20Age%20group%20identification%20using%20gaze-guided%20feature%20extraction.%20%3Ci%3E2023%20IEEE%2012th%20Global%20Conference%20on%20Consumer%20Electronics%20%28GCCE%29%3C%5C%2Fi%3E%2C%20708%26%23x2013%3B711.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FGCCE59613.2023.10315305%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FGCCE59613.2023.10315305%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Age%20group%20identification%20using%20gaze-guided%20feature%20extraction%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michiko%22%2C%22lastName%22%3A%22Inoue%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Masashi%22%2C%22lastName%22%3A%22Nishiyama%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yoshio%22%2C%22lastName%22%3A%22Iwai%22%7D%5D%2C%22abstractNote%22%3A%22When%20an%20observer%20identifies%20the%20age%20group%20of%20a%20subject%20in%20an%20image%2C%20the%20observer%5Cu2019s%20gaze%20focuses%20on%20a%20region%20containing%20an%20informative%20feature.%20Our%20aim%20is%20to%20improve%20the%20accuracy%20of%20age%20group%20identification%20by%20extracting%20features%20from%20the%20regions%20where%20an%20observer%5Cu2019s%20gaze%20converges.%20Existing%20studies%20have%20analysed%20observer%20gaze%20on%20facial%20images%2C%20but%20not%20on%20images%20containing%20the%20subject%5Cu2019s%20whole%20body.%20Here%2C%20we%20analysed%20which%20regions%20of%20the%20whole-body%20image%20an%20observer%5Cu2019s%20gaze%20focused%20on%20while%20the%20observer%20performed%20this%20task.%20The%20experimental%20results%20revealed%20that%20an%20observer%5Cu2019s%20gaze%20is%20drawn%20to%20the%20head%20of%20the%20subject%20regardless%20of%20the%20subject%5Cu2019s%20age%20group.%20They%20also%20revealed%20that%20the%20gaze-guided%20feature%20extraction%20in%20deep%20learning%20and%20machine%20learning%20improved%20the%20accuracy%20of%20age%20group%20identification.%22%2C%22date%22%3A%222023-10%22%2C%22proceedingsTitle%22%3A%222023%20IEEE%2012th%20Global%20Conference%20on%20Consumer%20Electronics%20%28GCCE%29%22%2C%22conferenceName%22%3A%222023%20IEEE%2012th%20Global%20Conference%20on%20Consumer%20Electronics%20%28GCCE%29%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1109%5C%2FGCCE59613.2023.10315305%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fieeexplore.ieee.org%5C%2Fabstract%5C%2Fdocument%5C%2F10315305%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-01-23T19%3A38%3A0
3Z%22%7D%7D%2C%7B%22key%22%3A%222BDIX2YW%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Sims%20et%20al.%22%2C%22parsedDate%22%3A%222023-09-20%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3ESims%2C%20J.%20P.%2C%20Haynes%2C%20A.%2C%20%26amp%3B%20Lanius%2C%20C.%20%282023%29.%20Exploring%20the%20utility%20of%20eye%20tracking%20for%20sociological%20research%20on%20race.%20%3Ci%3EThe%20British%20Journal%20of%20Sociology%3C%5C%2Fi%3E%2C%20%3Ci%3En%5C%2Fa%3C%5C%2Fi%3E%28n%5C%2Fa%29.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1111%5C%2F1468-4446.13054%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1111%5C%2F1468-4446.13054%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Exploring%20the%20utility%20of%20eye%20tracking%20for%20sociological%20research%20on%20race%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jennifer%20Patrice%22%2C%22lastName%22%3A%22Sims%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alex%22%2C%22lastName%22%3A%22Haynes%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Candice%22%2C%22lastName%22%3A%22Lanius%22%7D%5D%2C%22abstractNote%22%3A%22One%20part%20of%20the%20social%20construction%20of%20race%20is%20the%20symbolic%20association%20of%20given%20physical%20features%20with%20different%20races.%20This%20research%20note%20explores%20the%20utility%20of%20eye%20tracking%20for%20sociological%20research%20on%20racial%20perception%2C%20that%20is%2C%20for%20determining%20what%20race%20someone%20%5Cu2018looks%20like.%5Cu2019%20Results%20reveal%20that%20participants%20gave%20greatest%20attention%20to%20targets%27%20hair.%20This%20was%20especially%20so%20when%20targets%20of%20all%20races%20had%20straight%20hair%20or%20when%20a%20target%20identified%20as%20Black%5C%2FWhite%20mixed-race.%20The%20mixed-race%20results%20in%20particular%20provide%20physiological%20evidence%20of%20the%20theory%20of%20multiracial%20dissection.%20We%20conclude%20by%20suggesting%20that%20eye%20tracking%20can%20be%20useful%20to%20sociologists%20by%20revealing%20subconscious%20tendencies%20and%20biases%20which%2C%20once%20identified%2C%20can%20be%20consciously%20addressed%20in%20service%20to%20reducing%20social%20disparities.%22%2C%22date%22%3A%2220%20September%202023%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1111%5C%2F1468-4446.13054%22%2C%22ISSN%22%3A%221468-4446%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fonlinelibrary.wiley.com%5C%2Fdoi%5C%2Fabs%5C%2F10.1111%5C%2F1468-4446.13054%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-09-28T18%3A27%3A15Z%22%7D%7D%2C%7B%22key%22%3A%22HW6BF7SB%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Aslan%20et%20al.%22%2C%22parsedDate%22%3A%222023-09-18%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EAslan%2C%20M.%2C%20Baykara%2C%20M.%2C%20%26amp%3B%20Alaku%26%23x15F%3B%2C%20T.%20B.%20%282023%29.%20LSTMNCP%3A%20lie%20detection%20from%20EEG%20signals%20with%20novel%20hybrid%20deep%20learning%20method.%20%3Ci%3EMultimedia%20Tools%20and%20Applications%3C%5C%
2Fi%3E.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1007%5C%2Fs11042-023-16847-z%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1007%5C%2Fs11042-023-16847-z%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22LSTMNCP%3A%20lie%20detection%20from%20EEG%20signals%20with%20novel%20hybrid%20deep%20learning%20method%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Musa%22%2C%22lastName%22%3A%22Aslan%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Muhammet%22%2C%22lastName%22%3A%22Baykara%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Talha%20Burak%22%2C%22lastName%22%3A%22Alaku%5Cu015f%22%7D%5D%2C%22abstractNote%22%3A%22Lying%20has%20become%20an%20element%20of%20human%20nature.%20People%20lie%20intentionally%20or%20unintentionally%20at%20any%20point%20in%20their%20lives.%20Human%20beings%20can%20deceive%20by%20lying%20to%20justify%20themselves%20about%20something%20or%20to%20get%20rid%20of%20a%20wrongdoing.%20This%20lie%5Cu00a0can%20result%20in%20various%20consequences%2C%20including%20health%20deterioration%2C%20loss%20of%20life%2C%20a%20sense%20of%20insecurity%2C%20criminal%20behaviors%2C%20and%20more.%20Such%20situations%20are%20more%20common%20especially%20in%20daily%20life%2C%20security%2C%20and%20criminology.%20In%20these%20cases%2C%20lie%20detection%20is%20of%20vital%20importance.%20With%20the%20development%20of%20technology%2C%20lie%20detection%20becomes%20a%20more%20important%20issue.%20People%20can%20manipulate%20others%20and%20provide%20information%20by%20lying.%20This%20situation%20has%20led%20researchers%20to%20turn%20to%20more%20alternative%20ways%20and%20the%20importance%20of%20EEG%20signals%20has%20increased.%20Since%20EEG%20signals%20are%20difficult%20to%20manipulate%2C%20there%20has%20been%20an%20increase%20in%20their%20use%20and%20analysis%20in%20lie%20detection%20studies.%20In%20this%20study%2C%20lie%20detection%20was%20performed%20with%20EEG%20signals%20and%20the%20importance%20of%20EEG%20signals%20was%20demonstrated.%20Within%20the%20scope%20of%20this%20study%2C%20a%20novel%20hybrid%20deep%20learning%20method%20was%20designed%20on%20the%20Bag-of-Lies%20dataset%2C%20which%20was%20created%20using%20different%20methods%2C%20and%20lie%20detection%20was%20carried%20out.%20The%20study%20consisted%20of%20four%20stages.%20In%20the%20first%20stage%2C%20EEG%20data%20were%20obtained%20from%20the%20Bag-of-Lies%20dataset.%20In%20the%20second%20stage%2C%20the%20data%20were%20decomposed%20into%20sub-signals%20by%20DWT%20method.%20These%20signals%2C%20which%20were%20separated%20in%20the%20third%20stage%2C%20were%20classified%20with%20the%20designed%20novel%20hybrid%20deep%20learning%20model.%20At%20the%20last%20stage%20of%20the%20study%2C%20the%20performance%20of%20the%20classifier%20was%20determined%20by%20accuracy%2C%20precision%2C%20recall%2C%20F1-score%2C%20and%20AUC%20score.%5Cu00a0At%20the%20conclusion%20of%20the%20research%2C%20an%20accuracy%20score%20of%2097.88%25%20was%20achieved%2C%20demonstrating%20the%20significance%20of%20EEG%20signals%20in%20this%20domain.%22%2C%22date%22%3A%222023-09-18%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1007%5C%2Fs11042-023-16847-z%22%2C%22ISSN%22%3A%221573-7721%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1007%5C%2Fs11042-023-16847-z%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-09-21T18%3A19%3A01Z%22%7D%7D%2C%7B%22key%22%3A%22RDM4AYFA%22%2C%22library%22%3A%7B%22id
%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Yao%20et%20al.%22%2C%22parsedDate%22%3A%222023-09-12%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EYao%2C%20J.%2C%20Su%2C%20S.%2C%20%26amp%3B%20Liu%2C%20S.%20%282023%29.%20The%20effect%20of%20key%20audit%20matters%20reviewing%20on%20loan%20approval%20decisions%3F%20%3Ci%3EFinance%20Research%20Letters%3C%5C%2Fi%3E%2C%20104467.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.frl.2023.104467%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.frl.2023.104467%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22The%20effect%20of%20key%20audit%20matters%20reviewing%20on%20loan%20approval%20decisions%3F%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jie%22%2C%22lastName%22%3A%22Yao%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Shengqi%22%2C%22lastName%22%3A%22Su%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Shanmin%22%2C%22lastName%22%3A%22Liu%22%7D%5D%2C%22abstractNote%22%3A%22The%20existing%20literature%20does%20not%20provide%20a%20definitive%20statement%20on%20whether%20the%20inclusion%20of%20Key%20Audit%20Matters%20%28KAM%29%20in%20accounting%20research%20increases%20the%20likelihood%20of%20loan%20approval%20by%20bank%20approvers.%20This%20study%20investigates%20the%20combined%20effects%20of%20approvers%27%20trait%20skepticism%2C%20their%20professional%20background%20in%20accounting%20and%20auditing%2C%20as%20well%20as%20the%20familiarity%20and%20readability%20of%20KAM%20content%2C%20on%20loan%20approval%20probability.%20The%20findings%20indicate%20that%20an%20increased%20duration%20spent%20reading%20%27How%27%20paragraph%20enhances%20the%20probability%20of%20loan%20approval%20by%20approvers.%20Moreover%2C%20this%20positive%20correlation%20is%20more%20pronounced%20for%20familiar%20and%20easily%20legible%20content%2C%20highly%20skeptical%20individuals%2C%20and%20those%20with%20expertise%20in%20accounting%20and%20auditing.%20These%20results%20offer%20novel%20insights%20and%20research%20perspectives%20for%20KAM%20while%20expanding%20upon%20the%20literature%20concerning%20factors%20influencing%20approvers%27%20loan%20approvals.%22%2C%22date%22%3A%222023-09-12%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.frl.2023.104467%22%2C%22ISSN%22%3A%221544-6123%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.sciencedirect.com%5C%2Fscience%5C%2Farticle%5C%2Fpii%5C%2FS1544612323008395%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-09-18T16%3A12%3A54Z%22%7D%7D%2C%7B%22key%22%3A%22LIS54HUJ%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Furukado%20and%20Hagiwara%22%2C%22parsedDate%22%3A%222023-09-01%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EFurukado%2C%20R.%2C%20%26amp%3B%20Hagiwara%2C%20G.%20%282023%29.%20%3Ci%3EGaze%20and%20Electroencephalography%20%28EEG%29%20Parameters%20in%20Esports%3A%20Examinations%20Considering%20Genres%20and%20Skill%20Levels%3C%5C%2Fi%3E.%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%2
2book%22%2C%22title%22%3A%22Gaze%20and%20Electroencephalography%20%28EEG%29%20Parameters%20in%20Esports%3A%20Examinations%20Considering%20Genres%20and%20Skill%20Levels%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ryousuke%22%2C%22lastName%22%3A%22Furukado%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Goichi%22%2C%22lastName%22%3A%22Hagiwara%22%7D%5D%2C%22abstractNote%22%3A%22This%20study%20analyzed%20visual%20search%20and%20electroencephalography%20%28EEG%29%20data%20differences%20between%20skilled%20and%20semiskilled%20players%20in%20first-person%20shooting%20%28FPS%29%20and%20multiplayer%20online%20battle%20arena%20%28MOBA%29%20games.%20The%20results%20showed%20that%20skilled%20FPS%20players%20gazed%20more%20at%20the%20reticle%20and%20minimap%20areas%20and%20had%20stronger%20EEG%20activity%20during%20the%20flow%20state%20than%20semiskilled%20players.%20Skilled%20MOBA%20players%20gazed%20more%20at%20the%20user%20interfaces%2C%20particularly%20the%20minimap%2C%20but%20paid%20less%20attention%20during%20gameplay.%20Combining%20eye%20gaze%20and%20EEG%20data%20identified%20differences%20in%20esports%20by%20genre%20and%20skill%20level.%22%2C%22date%22%3A%222023-09-01%22%2C%22language%22%3A%22%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-01-23T01%3A24%3A54Z%22%7D%7D%2C%7B%22key%22%3A%22LIK8PQ66%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Liu%20et%20al.%22%2C%22parsedDate%22%3A%222023-09-01%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3ELiu%2C%20Y.%2C%20Ghaiumy%20Anaraky%2C%20R.%2C%20Aly%2C%20H.%2C%20%26amp%3B%20Byrne%2C%20K.%20%282023%29.%20The%20Effect%20of%20Privacy%20Fatigue%20on%20Privacy%20Decision-Making%20Behavior.%20%3Ci%3EProceedings%20of%20the%20Human%20Factors%20and%20Ergonomics%20Society%20Annual%20Meeting%3C%5C%2Fi%3E%2C%20%3Ci%3E67%3C%5C%2Fi%3E%281%29%2C%202428%26%23x2013%3B2433.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F21695067231193670%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F21695067231193670%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22The%20Effect%20of%20Privacy%20Fatigue%20on%20Privacy%20Decision-Making%20Behavior%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yizhou%22%2C%22lastName%22%3A%22Liu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Reza%22%2C%22lastName%22%3A%22Ghaiumy%20Anaraky%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Heba%22%2C%22lastName%22%3A%22Aly%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kaileigh%22%2C%22lastName%22%3A%22Byrne%22%7D%5D%2C%22abstractNote%22%3A%22With%20the%20rapid%20growth%20of%20information%20communication%20technology%2C%20internet%20users%20face%20constantly%20demands%20to%20manage%20privacy%20setting%20and%20make%20decisions%20on%20whether%20to%20disclose%20privacy%20data%20in%20the%20digital%20environment.%20Such%20privacy-related%20decisions%20and%20actions%20are%20heavily%20influenced%20by%20users%5Cu2019%20level%20of%20trust%2C%20privacy%20concerns%2C%20perceived%20risks%2C%20and%20perceived%20benefits.%20One%20recently%20recognized%20factor%20that%20may%20impair%20pri
vacy%20behaviors%20is%20privacy%20fatigue.%20This%20study%20will%20employ%20a%20within-subjects%20design%20with%20eye-tracking%20devices%20in%20which%20participants%20complete%20the%20same%20online%20decision-making%20task.%20Additionally%2C%20potential%20covariates%2C%20including%20privacy%20concern%20and%20digital%20privacy%20literacy%2C%20will%20be%20controlled%20for.%20We%20expect%20that%20higher%20level%20of%20baseline%20privacy%20fatigue%20will%20reveal%20poorer%20privacy%20decision-making%20outcomes%2C%20which%20will%20be%20represented%20by%20more%20frequent%20acceptance%20of%20online%20cookies.%20Moreover%2C%20we%20predict%20that%20participants%5Cu2019%20pupil%20dilation%2C%20a%20physiological%20index%20of%20cognitive%20effort%2C%20will%20decrease%2C%20and%20privacy%20fatigue%20levels%20will%20increase%20over%20the%20course%20of%20the%20online%20decision-making%20task.%22%2C%22date%22%3A%222023-09-01%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1177%5C%2F21695067231193670%22%2C%22ISSN%22%3A%222169-5067%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F21695067231193670%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-01-03T18%3A43%3A06Z%22%7D%7D%2C%7B%22key%22%3A%22SAC92BG3%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Biondi%20et%20al.%22%2C%22parsedDate%22%3A%222023-09-01%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EBiondi%2C%20F.%20N.%2C%20Graf%2C%20F.%2C%20Pillai%2C%20P.%2C%20%26amp%3B%20Balasingam%2C%20B.%20%282023%29.%20On%20Validating%20a%20Generic%20Video-Based%20Blink%20Detection%20System%20for%20Cognitive%20Load%20Detection.%20%3Ci%3EProceedings%20of%20the%20Human%20Factors%20and%20Ergonomics%20Society%20Annual%20Meeting%3C%5C%2Fi%3E%2C%20%3Ci%3E67%3C%5C%2Fi%3E%281%29%2C%201425%26%23x2013%3B1430.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F21695067231192924%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F21695067231192924%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22On%20Validating%20a%20Generic%20Video-Based%20Blink%20Detection%20System%20for%20Cognitive%20Load%20Detection%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Francesco%20N.%22%2C%22lastName%22%3A%22Biondi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Frida%22%2C%22lastName%22%3A%22Graf%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Prarthana%22%2C%22lastName%22%3A%22Pillai%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Balakumar%22%2C%22lastName%22%3A%22Balasingam%22%7D%5D%2C%22abstractNote%22%3A%22This%20work%20aims%20to%20validate%20using%20a%20video-based%20blink%20detection%20system%20for%20detecting%20changes%20in%20cognitive%20load.%20Participants%20completed%20a%20cognitive%20task%20with%20increasing%20levels%20of%20difficulty.%20Blink%20rate%20was%20recorded%20via%20both%20our%20video-based%20system%20and%20a%20scientific-grade%20eye-tracker.%20Results%20showed%20no%20differences%20in%20the%20blink%20rates%20recorded%20by%20the%20two%20systems.%20However%2C%20while%20strong%20evidence%20was%20found%20that%20the%20blink%20rate%20recorded%20through%20the%20eye-tracker%20increased%20under%20greater%20cognitive%20task%20demand%2C%20no%20such%2
0pattern%20was%20observed%20for%20the%20output%20from%20the%20video-based%20system.%20Our%20findings%20show%20that%20the%20overall%20performance%20of%20the%20video-based%20system%20was%20comparable%20with%20that%20of%20the%20eye-tracker%27s%2C%20but%20its%20sensitivity%20in%20detecting%20changes%20in%20cognitive%20load%20was%20inferior%20to%20that%20of%20its%20counterpart%27s.%22%2C%22date%22%3A%222023-09-01%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1177%5C%2F21695067231192924%22%2C%22ISSN%22%3A%222169-5067%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F21695067231192924%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-01-03T18%3A42%3A16Z%22%7D%7D%2C%7B%22key%22%3A%22G2X49AB9%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Hatzithomas%20et%20al.%22%2C%22parsedDate%22%3A%222023-09-01%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EHatzithomas%2C%20L.%2C%20Theodorakioglou%2C%20F.%2C%20Margariti%2C%20K.%2C%20%26amp%3B%20Boutsouki%2C%20C.%20%282023%29.%20Cross-media%20advertising%20strategies%20and%20brand%20attitude%3A%20the%20role%20of%20cognitive%20load.%20%3Ci%3EInternational%20Journal%20of%20Advertising%3C%5C%2Fi%3E%2C%20%3Ci%3E0%3C%5C%2Fi%3E%280%29%2C%201%26%23x2013%3B33.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1080%5C%2F02650487.2023.2249342%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1080%5C%2F02650487.2023.2249342%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Cross-media%20advertising%20strategies%20and%20brand%20attitude%3A%20the%20role%20of%20cognitive%20load%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Leonidas%22%2C%22lastName%22%3A%22Hatzithomas%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Fotini%22%2C%22lastName%22%3A%22Theodorakioglou%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kostoula%22%2C%22lastName%22%3A%22Margariti%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Christina%22%2C%22lastName%22%3A%22Boutsouki%22%7D%5D%2C%22abstractNote%22%3A%22In%20recent%20years%2C%20cross-media%20advertising%20has%20received%20widespread%20attention%20from%20researchers%20and%20practitioners%20seeking%20effective%20ways%20to%20communicate%20with%20their%20audience.%20Building%20on%20Kahneman%5Cu2019s%20dual-system%20theory%2C%20the%20present%20article%20proposes%20a%20model%20of%20the%20impact%20of%20cross-media%20advertising%20on%20brand%20attitude%20%28Abr%29.%20An%20eye-tracking%20experiment%20with%2060%20participants%20indicates%20that%20simultaneous%20%28vs.%20sequential%29%20exposure%20to%20ads%20for%20the%20same%20brand%20on%20TV%20and%20the%20Internet%20increases%20cognitive%20load%20and%2C%20through%20subjective%20comprehension%2C%20decreases%20brand%20attitude.%20Two%20online%20experiments%20with%20395%20and%20198%20participants%20in%20a%20low-%20and%20high-involvement%20product%20category%2C%20respectively%2C%20validate%20the%20proposed%20model.%20Experiment%202%20reveals%20that%20in%20sequential%20exposure%20to%20TV%20and%20the%20internet%2C%20the%20fit%20between%20campaign%20ads%20further%20decreases%20the%20cognitive%20load%20leading%20to%20improved%20brand%20attitude.%20Experiment%203%20strongly%20suggests%20that%20in
%20simultaneous%20exposure%2C%20synchronous%20%28vs.%20asynchronous%29%20ads%20reduce%20cognitive%20demands%20and%2C%20through%20subjective%20comprehension%20and%20TV%20ad%20engagement%2C%20improve%20brand%20attitude.%22%2C%22date%22%3A%222023-09-01%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1080%5C%2F02650487.2023.2249342%22%2C%22ISSN%22%3A%220265-0487%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1080%5C%2F02650487.2023.2249342%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-09-12T22%3A06%3A02Z%22%7D%7D%2C%7B%22key%22%3A%22433KM64V%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Prahm%20et%20al.%22%2C%22parsedDate%22%3A%222023-09-01%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EPrahm%2C%20C.%2C%20Konieczny%2C%20J.%2C%20Bressler%2C%20M.%2C%20Heinzel%2C%20J.%2C%20Daigeler%2C%20A.%2C%20Kolbenschlag%2C%20J.%2C%20%26amp%3B%20Lauer%2C%20H.%20%282023%29.%20Influence%20of%20colored%20face%20masks%20on%20judgments%20of%20facial%20attractiveness%20and%20gaze%20patterns.%20%3Ci%3EActa%20Psychologica%3C%5C%2Fi%3E%2C%20%3Ci%3E239%3C%5C%2Fi%3E%2C%20103994.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.actpsy.2023.103994%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.actpsy.2023.103994%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Influence%20of%20colored%20face%20masks%20on%20judgments%20of%20facial%20attractiveness%20and%20gaze%20patterns%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Cosima%22%2C%22lastName%22%3A%22Prahm%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Julia%22%2C%22lastName%22%3A%22Konieczny%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michael%22%2C%22lastName%22%3A%22Bressler%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Johannes%22%2C%22lastName%22%3A%22Heinzel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Adrien%22%2C%22lastName%22%3A%22Daigeler%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jonas%22%2C%22lastName%22%3A%22Kolbenschlag%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Henrik%22%2C%22lastName%22%3A%22Lauer%22%7D%5D%2C%22abstractNote%22%3A%22Background%5CnFacial%20aesthetics%20are%20of%20great%20importance%20in%20social%20interaction.%20With%20the%20widespread%20adoption%20of%20face%20masks%20in%20response%20to%20the%20Covid-19%20pandemic%2C%20there%20is%20growing%20interest%20in%20understanding%20how%20wearing%20masks%20might%20impact%20perceptions%20of%20attractiveness%2C%20as%20they%20partially%20or%20completely%20conceal%20facial%20features%20that%20are%20typically%20associated%20with%20attractiveness.%5CnObjectives%5CnThis%20study%20aimed%20to%20explore%20the%20impact%20of%20mask%20wearing%20on%20attractiveness%20and%20to%20investigate%20whether%20the%20color%20%28red%20or%20blue%29%20of%20the%20mask%20has%20any%20effect%20on%20the%20perception%20of%20a%20person%27s%20attractiveness%2C%20while%20also%20considering%20gender%20and%20age%20as%20contributing%20factors.%20Additionally%2C%20the%20study%20intended%20to%20evaluate%20gaze%20patterns%2C%20initial%20focus%2C%20and%20dwell%20time%20in%20response%20to%20masked%20and%20unmasked%20faces.%5CnMe
thods%5Cn30%20AI-generated%20images%20of%2015%20female%20and%2015%20male%20faces%20were%20presented%20to%2071%20participants%20%2835%20male%2C%2036%20female%29%20in%203%20conditions%3A%20not%20wearing%20any%20mask%2C%20wearing%20a%20red%20surgical%20mask%2C%20and%20wearing%20a%20blue%20surgical%20mask.%20The%20perceived%20attractiveness%20was%20rated%20on%20an%20ordinal%20scale%20of%201%5Cu201310%20%2810%20being%20most%20attractive%29.%20Gaze%20behavior%2C%20dwell%20time%20and%20initial%20focus%20were%20recorded%20using%20a%20stationary%20eye-tracking%20system.%5CnResults%5CnThe%20study%20found%20that%20wearing%20masks%20had%20no%20significant%20effect%20on%20the%20attractiveness%20ratings%20of%20female%20faces%20%28p%5Cu00a0%3D%5Cu00a0.084%29%2C%20but%20it%20did%20benefit%20the%20perceived%20attractiveness%20of%20male%20faces%20which%20were%20initially%20rated%20lower%20%28p%5Cu00a0%3D%5Cu00a0.16%29.%20Gender%20and%20age%20also%20played%20a%20significant%20role%2C%20as%20both%20male%20and%20female%20participants%20rated%20female%20stimuli%20higher%20than%20male%20stimuli%20%28p%5Cu00a0%3C%5Cu00a0.001%29%2C%20and%20younger%20participants%20rated%20both%20genders%20as%20less%20attractive%20than%20older%20participants%20%28p%5Cu00a0%3C%5Cu00a0.01%29.%20However%2C%20there%20was%20no%20significant%20influence%20of%20the%20mask%27s%20color%20on%20attractiveness.%20During%20the%20eye-tracking%20analysis%2C%20the%20periorbital%20region%20was%20of%20greater%20interest%20while%20masked%2C%20with%20the%20time%20to%20first%20fixation%20for%20the%20eyes%20being%20lower%20than%20the%20non-masked%20stimulus%20%28p%5Cu00a0%3C%5Cu00a0.001%29%20and%20showed%20a%20longer%20dwell%20time%20%28p%5Cu00a0%3C%5Cu00a0.001%29.%20The%20lower%20face%20was%20shown%20less%20interest%20while%20masked%20as%20the%20time%20to%20first%20fixation%20was%20higher%20%28p%5Cu00a0%3C%5Cu00a0.001%29%20and%20the%20fixation%20count%20was%20less%20%28p%5Cu00a0%3C%5Cu00a0.001%29.%20Mask%20color%20did%20not%20influence%20the%20scan%20path%20and%20there%20was%20no%20difference%20in%20revisits%20to%20the%20mask%20area%20between%20red%20or%20blue%20masks%20%28p%5Cu00a0%3D%5Cu00a0.202%29%2C%20nor%20was%20there%20a%20difference%20in%20time%20to%20first%20fixation%20%28p%5Cu00a0%3D%5Cu00a0.660%29.%5CnConclusions%5CnThe%20study%20findings%20indicate%20that%20there%20is%20an%20interplay%20between%20the%20gender%20and%20age%20of%20the%20participant%20and%20the%20facial%20stimuli.%20The%20color%20red%20did%20have%20an%20effect%20on%20the%20perception%20attractiveness%2C%20however%20not%20in%20female%20faces.%20The%20results%20suggest%20that%20masks%2C%20especially%20red%20ones%2C%20might%20be%20more%20beneficial%20for%20male%20faces%2C%20which%20were%20perceived%20as%20less%20attractive%20without%20a%20mask.%20However%2C%20wearing%20a%20mask%20did%20not%20significantly%20impact%20already%20attractive%20faces.%20The%20eye-tracking%20results%20revealed%20that%20the%20periorbital%20region%20attracted%20more%20attention%20and%20was%20fixated%20on%20more%20quickly%20while%20wearing%20a%20mask%2C%20indicating%20the%20importance%20of%20eyes%20in%20social%20interaction%20and%20aesthetic%20perception.%22%2C%22date%22%3A%222023-09-01%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.actpsy.2023.103994%22%2C%22ISSN%22%3A%220001-6918%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.sciencedirect.com%5C%2Fscience%5C%2Farticle%5C%2Fpii%5C%2FS0001691823001701%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-09-12T18%3A01%3A44Z%22%7D%7D%2C%7B
%22key%22%3A%22QXQ5S9LW%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Ja%5Cu015bkowiec%20and%20Kowalska-Chrzanowska%22%2C%22parsedDate%22%3A%222023-08-30%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EJa%26%23x15B%3Bkowiec%2C%20M.%2C%20%26amp%3B%20Kowalska-Chrzanowska%2C%20M.%20%282023%29.%20The%20Use%20of%20Games%20in%20Citizen%20Science%20Based%20on%20Findings%20from%20the%20EyeWire%20User%20Study.%20%3Ci%3EGames%20and%20Culture%3C%5C%2Fi%3E%2C%2015554120231196260.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F15554120231196260%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F15554120231196260%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22The%20Use%20of%20Games%20in%20Citizen%20Science%20Based%20on%20Findings%20from%20the%20EyeWire%20User%20Study%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mirela%22%2C%22lastName%22%3A%22Ja%5Cu015bkowiec%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ma%5Cu0142gorzata%22%2C%22lastName%22%3A%22Kowalska-Chrzanowska%22%7D%5D%2C%22abstractNote%22%3A%22The%20article%20addresses%20the%20use%20of%20games%20for%20citizen%20research.%20Following%20the%20results%20of%20the%20EyeWire%20user%20research%2C%20the%20authors%20attempt%20to%20answer%20the%20question%20of%20the%20impact%20of%20introductory%20game%20training%20on%20task%20performance%2C%20identify%20the%20areas%20with%20the%20most%20significant%20effect%20on%20participants%5Cu2019%20performance%2C%20and%20assess%20users%5Cu2019%20impressions%20and%20level%20of%20engagement%20in%20the%20proposed%20working%20model.%20A%20survey%20method%20was%20used%20to%20investigate%20user%20impressions.%20Fixation%20data%20were%20obtained%20from%20eye-tracking%20studies.%20The%20research%20shows%2C%20that%20users%20with%20experience%20with%20computer%20games%20do%20better%20in%20scientific%20discovery%20games.%20The%20main%20reasons%20to%20engage%20in%20this%20type%20of%20project%20are%20the%20need%20for%20learning%20development%20and%20self-development.%20The%20results%20indicate%20a%20significant%20cognitive%20strain%20on%20users%2C%20notably%20in%20the%20initial%20phase%20of%20solving%20tasks%20independently.%20It%20infers%20the%20conclusion%20that%20this%20should%20be%20considered%20when%20designing%20such%20programs%20and%20the%20pace%20of%20introducing%20the%20user%20to%20its%20functions%20should%20be%20adjusted.%22%2C%22date%22%3A%222023-08-30%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1177%5C%2F15554120231196260%22%2C%22ISSN%22%3A%221555-4120%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F15554120231196260%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-09-12T21%3A59%3A56Z%22%7D%7D%2C%7B%22key%22%3A%22GEPJ7LQL%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Rocca%20et%20al.%22%2C%22parsedDate%22%3A%222023-08-29%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3ERocca%2C%20F.%2C%20Dave%2C%20M.%2C%20Duvivier%2C%20V.%2C%20Van%20Daele%2C%20A.%2C%20Demeuse%2C%20M.%2C%2
0Derobertmasure%2C%20A.%2C%20Mancas%2C%20M.%2C%20%26amp%3B%20Gosselin%2C%20B.%20%282023%29.%20Designing%20an%20Assistance%20Tool%20for%20Analyzing%20and%20Modeling%20Trainer%20Activity%20in%20Professional%20Training%20Through%20Simulation.%20%3Ci%3EProceedings%20of%20the%202023%20ACM%20International%20Conference%20on%20Interactive%20Media%20Experiences%3C%5C%2Fi%3E%2C%20180%26%23x2013%3B187.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1145%5C%2F3573381.3596475%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1145%5C%2F3573381.3596475%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Designing%20an%20Assistance%20Tool%20for%20Analyzing%20and%20Modeling%20Trainer%20Activity%20in%20Professional%20Training%20Through%20Simulation%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Francois%22%2C%22lastName%22%3A%22Rocca%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Madison%22%2C%22lastName%22%3A%22Dave%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Val%5Cu00e9rie%22%2C%22lastName%22%3A%22Duvivier%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Agn%5Cu00e8s%22%2C%22lastName%22%3A%22Van%20Daele%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marc%22%2C%22lastName%22%3A%22Demeuse%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Antoine%22%2C%22lastName%22%3A%22Derobertmasure%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Matei%22%2C%22lastName%22%3A%22Mancas%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bernard%22%2C%22lastName%22%3A%22Gosselin%22%7D%5D%2C%22abstractNote%22%3A%22Human%20audience%20analysis%20is%20crucial%20in%20numerous%20domains%20to%20understand%20people%5Cu2019s%20behavior%20based%20on%20their%20knowledge%20and%20environment.%20In%20this%20paper%20we%20focus%20on%20simulation-based%20training%20which%20has%20become%20a%20popular%20teaching%20approach%20that%20requires%20a%20trainer%20able%20to%20manage%20a%20lot%20of%20data%20at%20the%20same%20time.%20We%20present%20tools%20that%20are%20currently%20being%20developed%20to%20help%20trainers%20from%20two%20different%20fields%3A%20training%20for%20practical%20teaching%20gestures%20and%20training%20in%20civil%20defense.%20In%20this%20sense%2C%20three%20technological%20blocks%20are%20built%20to%20collect%20and%20analyze%20data%20about%20trainers%5Cu2019%20and%20learners%5Cu2019%20gestures%2C%20gaze%2C%20speech%20and%20movements.%20The%20paper%20also%20discusses%20the%20future%20work%20planned%20for%20this%20project%2C%20including%20the%20integration%20of%20the%20framework%20into%20the%20Noldus%20system%20and%20its%20use%20in%20civil%20security%20training.%20Overall%2C%20the%20article%20highlights%20the%20potential%20of%20technology%20to%20improve%20simulation-based%20training%20and%20provides%20a%20roadmap%20for%20future%20development.%22%2C%22date%22%3A%22August%2029%2C%202023%22%2C%22proceedingsTitle%22%3A%22Proceedings%20of%20the%202023%20ACM%20International%20Conference%20on%20Interactive%20Media%20Experiences%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1145%5C%2F3573381.3596475%22%2C%22ISBN%22%3A%229798400700286%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1145%5C%2F3573381.3596475%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-01-23T18%3A42%3A39Z%22%7D%7D%2C%7B%22key%22%3A%22QRUCZBID%22%2C%22library%22%3A%7B%22id%22%3A315
1148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Huang%20et%20al.%22%2C%22parsedDate%22%3A%222023-08-26%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EHuang%2C%20J.%2C%20Raja%2C%20J.%2C%20Cantor%2C%20C.%2C%20Marx%2C%20W.%2C%20Galgano%2C%20S.%2C%20Zarzour%2C%20J.%2C%20Caridi%2C%20T.%2C%20Gunn%2C%20A.%2C%20Morgan%2C%20D.%2C%20%26amp%3B%20Smith%2C%20A.%20%282023%29.%20Eye%20Motion%20Tracking%20for%20Medical%20Image%20Interpretation%20Training.%20%3Ci%3ECurrent%20Problems%20in%20Diagnostic%20Radiology%3C%5C%2Fi%3E.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1067%5C%2Fj.cpradiol.2023.08.013%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1067%5C%2Fj.cpradiol.2023.08.013%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Eye%20Motion%20Tracking%20for%20Medical%20Image%20Interpretation%20Training%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Junjian%22%2C%22lastName%22%3A%22Huang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Junaid%22%2C%22lastName%22%3A%22Raja%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Chanler%22%2C%22lastName%22%3A%22Cantor%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22William%22%2C%22lastName%22%3A%22Marx%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sam%22%2C%22lastName%22%3A%22Galgano%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jessica%22%2C%22lastName%22%3A%22Zarzour%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Theresa%22%2C%22lastName%22%3A%22Caridi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22AJ%22%2C%22lastName%22%3A%22Gunn%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Desiree%22%2C%22lastName%22%3A%22Morgan%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Andrew%22%2C%22lastName%22%3A%22Smith%22%7D%5D%2C%22abstractNote%22%3A%22The%20significance%20of%20Eye%20Motion%20Tracking%20%28EMT%29%20in%20aiding%20learners%20in%20training%20search%20patterns%2C%20pattern%20recognition%2C%20and%20efficiently%20using%20their%20gaze%20in%20terms%20of%20time%20and%20scanning%20distribution%20has%20been%20highlighted%20in%20the%20USAF%20Pilot%20Training%20Next%20initiative.%20The%20innovation%20described%20further%20builds%20on%20this%20concept%20in%20the%20realm%20of%20medical%20imaging%20and%20the%20provision%20of%20real-time%20feedback%20of%20eye%20direction%20and%20gaze%20duration.%20This%20real-time%20indicator%20enables%20the%20trainer%20to%20adapt%20verbal%20queueing%20of%20the%20trainee%20in%20a%20personalized%20manner%20to%20improve%20knowledge%20transfer%2C%20and%20to%20increase%20the%20confidence%20of%20the%20trainer%20and%20trainee%20in%20the%20competency%20of%20the%20trainee.%20The%20initial%20experiment%20data%20set%20included%20bone%20radiographs%2C%20digital%20subtraction%20angiograms%2C%20and%20computed%20tomography%20images.%20Preliminary%20results%20and%20formative%20feedback%20from%20participants%20was%20encouraging%20with%20expert%20viewers%20able%20to%20use%20EMT%20to%20successfully%20guide%20novice%20readers%20through%20search%20and%20gaze%20protocol%20patterns%20of%20the%20medical%20images.%22%2C%22date%22%3A%222023-08-26%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210
.1067%5C%2Fj.cpradiol.2023.08.013%22%2C%22ISSN%22%3A%220363-0188%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.sciencedirect.com%5C%2Fscience%5C%2Farticle%5C%2Fpii%5C%2FS0363018823001299%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-09-12T22%3A05%3A40Z%22%7D%7D%2C%7B%22key%22%3A%225SIJPB5Z%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Brancucci%20et%20al.%22%2C%22parsedDate%22%3A%222023-08-01%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EBrancucci%2C%20A.%2C%20Ferracci%2C%20S.%2C%20D%26%23x2019%3BAnselmo%2C%20A.%2C%20%26amp%3B%20Manippa%2C%20V.%20%282023%29.%20Hemispheric%20functional%20asymmetries%20and%20sex%20effects%20in%20visual%20bistable%20perception.%20%3Ci%3EConsciousness%20and%20Cognition%3C%5C%2Fi%3E%2C%20%3Ci%3E113%3C%5C%2Fi%3E%2C%20103551.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.concog.2023.103551%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.concog.2023.103551%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Hemispheric%20functional%20asymmetries%20and%20sex%20effects%20in%20visual%20bistable%20perception%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alfredo%22%2C%22lastName%22%3A%22Brancucci%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sara%22%2C%22lastName%22%3A%22Ferracci%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anita%22%2C%22lastName%22%3A%22D%27Anselmo%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Valerio%22%2C%22lastName%22%3A%22Manippa%22%7D%5D%2C%22abstractNote%22%3A%22This%20study%20investigates%20bistable%20perception%20as%20a%20function%20of%20the%20presentation%20side%20of%20the%20ambiguous%20figures%20and%20of%20participants%5Cu2019%20sex%2C%20to%20evaluate%20left%5Cu2013right%20hemispheric%20%28LH-RH%29%20asymmetries%20related%20to%20consciousness.%20In%20two%20experiments%20using%20the%20divided%20visual%20field%20paradigm%2C%20two%20Rubin%27s%20vase-faces%20figures%20were%20projected%20simultaneously%20and%20continuously%20180%5Cu00a0s%20long%20to%20the%20left%20%28LVF%29%20and%20right%20%28RVF%3B%20Experiment%201%29%20or%20to%20the%20upper%20%28UVF%29%20and%20lower%20%28DVF%3B%20Experiment%202%29%20visual%20hemifields%20of%2048%20healthy%20subjects%20monitored%20with%20eye-tracker.%20Experiment%201%20enables%20stimulus%20segregation%20from%20the%20LVF%20to%20the%20RH%20and%20from%20the%20RVF%20to%20the%20LH%2C%20whereas%20Experiment%202%20does%20not.%20Results%20from%20Experiment%201%20show%20that%20males%20perceived%20the%20face%20profiles%20for%20more%20time%20in%20the%20LVF%20than%20in%20the%20RVF%2C%20with%20an%20opposite%20trend%20for%20the%20vase%2C%20whereas%20females%20show%20a%20similar%20pattern%20of%20perception%20in%20the%20two%20hemifields.%20A%20related%20result%20confirmed%20the%20previously%20reported%20possibility%20to%20have%20simultaneously%20two%20different%20percepts%20%28qualia%29%20in%20the%20two%20hemifields%20elicited%20by%20the%20two%20identic%20ambiguous%20stimuli%2C%20which%20was%20here%20observed%20to%20occur%20more%20frequently%20in%20males.%20Similar%20effects%20were%20not%20observed%20in%20Experiment%202.%20These%20findings%20suggest%20that%20the%20percepts%20display%20t
he%20processing%20abilities%20of%20the%20hemisphere%20currently%20processing%20the%20stimulus%20eliciting%20them%20%28e.g.%2C%20RH-faces%29%2C%20and%20that%20females%20and%20males%20reflect%20in%20bistable%20perception%2C%20a%20genuine%20manifestation%20of%20consciousness%2C%20the%20well-known%20hemispheric%20asymmetry%20differences%20they%20show%20in%20ordinary%20perception.%22%2C%22date%22%3A%222023-08-01%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.concog.2023.103551%22%2C%22ISSN%22%3A%221053-8100%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.sciencedirect.com%5C%2Fscience%5C%2Farticle%5C%2Fpii%5C%2FS1053810023000880%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-01-23T18%3A49%3A26Z%22%7D%7D%2C%7B%22key%22%3A%22PEDCFRX9%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Lu%20et%20al.%22%2C%22parsedDate%22%3A%222023-08%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3ELu%2C%20H.-Y.%2C%20Lin%2C%20Y.-C.%2C%20Chen%2C%20C.-H.%2C%20Wang%2C%20C.-C.%2C%20Han%2C%20I.-W.%2C%20%26amp%3B%20Liang%2C%20W.-L.%20%282023%29.%20Detecting%20Children%20with%20Autism%20Spectrum%20Disorder%20Based%20on%20Eye-tracking%20and%20Machine%20Learning.%20%3Ci%3E2023%20IEEE%206th%20International%20Conference%20on%20Knowledge%20Innovation%20and%20Invention%20%28ICKII%29%3C%5C%2Fi%3E%2C%20372%26%23x2013%3B375.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FICKII58656.2023.10332630%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FICKII58656.2023.10332630%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Detecting%20Children%20with%20Autism%20Spectrum%20Disorder%20Based%20on%20Eye-tracking%20and%20Machine%20Learning%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Hsin-Yi%22%2C%22lastName%22%3A%22Lu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yang-Cheng%22%2C%22lastName%22%3A%22Lin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Chia-Hsin%22%2C%22lastName%22%3A%22Chen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Chih-Chung%22%2C%22lastName%22%3A%22Wang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ia-Wen%22%2C%22lastName%22%3A%22Han%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Wen-Lung%22%2C%22lastName%22%3A%22Liang%22%7D%5D%2C%22abstractNote%22%3A%22Autism%20Spectrum%20Disorder%20%28ASD%29%20is%20a%20neurodevelopmental%20disorder%20that%20appears%20in%20childhood%20with%20varying%20severity.%20Unfortunately%2C%20current%20screening%20methods%20for%20ASD%20often%20lack%20sensitivity.%20While%20clinical%20observation-based%20evaluations%20are%20more%20reliable%2C%20they%20are%20also%20time-consuming.%20An%20artificial%20intelligence%20%28AI%29%20model%20has%20the%20potential%20to%20analyze%20eye-tracking%20data%20and%20provide%20a%20more%20efficient%20and%20objective%20assessment%20of%20ASD%2C%20potentially%20reducing%20diagnosis%20time%20and%20improving%20accuracy.%20The%20model%20helps%20children%20with%20ASD%20access%20necessary%20medical%20resources%20more%20promptly.%20In%20the%20current%20study%2C%2058%20children%20aged%204%20to%206%20years%20were%20recruited.%20The%20children%20were%20presented%20with%20static%20images%20that%20wer
e%20divided%20into%20five%20categories%20based%20on%20the%20number%20of%20individuals%20and%20interactions.%20Eye-tracking%20technology%20was%20used%20to%20record%20various%20metrics%2C%20including%20fixation%20count%2C%20duration%2C%20percentage%2C%20and%20time%20to%20the%20first%20fixation%20on%20areas%20of%20interest%20%28AOI%29%20during%20image%20viewing.%20Principal%20Component%20Analysis%20%28PCA%29%20was%20implemented%20for%20feature%20selection%2C%20and%20Synthetic%20Minority%20Over-sampling%20Technique%20was%20used%20to%20generate%20minority%20samples.%20Five%20classification%20algorithms%20%28Decision%20Tree%2C%20Random%20Forest%2C%20Logistic%20Regression%2C%20Extreme%20Gradient%20Boosting%2C%20and%20Support%20Vector%20Machine%29%20were%20applied%20to%20classify%20children%20with%20ASD%20and%20TD.%20Both%20groups%20of%20children%20completed%20calibration%20and%20testing%20after%20guidance%2C%20and%20no%20adverse%20reactions%20occurred%20during%20the%20testing%20process.%20The%20results%20showed%20that%20the%20classification%20models%20achieved%20an%20SVM%20accuracy%20of%2083.3%25%2C%20an%20AUC%20of%200.94%2C%20an%20LR%20accuracy%20of%2083.3%25%2C%20an%20AUC%20of%200.95%2C%20a%20DT%20accuracy%20of%2077.8%25%2C%20an%20AUC%20of%200.81%2C%20an%20RF%20accuracy%20of%2088.9%25%2C%20and%20an%20AUC%20of%200.96%2C%20as%20well%20as%20an%20XGBoost%20accuracy%20of%2094.4%25%20and%20an%20AUC%20of%200.99%20for%20selected%2010%20features.%20Using%20machine%20learning%20methods%20to%20analyze%20eye-tracking%20data%20was%20a%20useful%20tool%20for%20identifying%20children%20with%20autism%20and%20may%20aid%20in%20the%20diagnostic%20process.%22%2C%22date%22%3A%222023-08%22%2C%22proceedingsTitle%22%3A%222023%20IEEE%206th%20International%20Conference%20on%20Knowledge%20Innovation%20and%20Invention%20%28ICKII%29%22%2C%22conferenceName%22%3A%222023%20IEEE%206th%20International%20Conference%20on%20Knowledge%20Innovation%20and%20Invention%20%28ICKII%29%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1109%5C%2FICKII58656.2023.10332630%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fieeexplore.ieee.org%5C%2Fabstract%5C%2Fdocument%5C%2F10332630%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-01-23T01%3A24%3A23Z%22%7D%7D%2C%7B%22key%22%3A%22JWNTUZ73%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Abeysinghe%20et%20al.%22%2C%22parsedDate%22%3A%222023-08%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EAbeysinghe%2C%20Y.%2C%20Mahanama%2C%20B.%2C%20Jayawardena%2C%20G.%2C%20Sunkara%2C%20M.%2C%20Ashok%2C%20V.%2C%20%26amp%3B%20Jayarathna%2C%20S.%20%282023%29.%20Gaze%20Analytics%20Dashboard%20for%20Distributed%20Eye%20Tracking.%20%3Ci%3E2023%20IEEE%2024th%20International%20Conference%20on%20Information%20Reuse%20and%20Integration%20for%20Data%20Science%20%28IRI%29%3C%5C%2Fi%3E%2C%20140%26%23x2013%3B145.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FIRI58017.2023.00031%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FIRI58017.2023.00031%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Gaze%20Analytics%20Dashboard%20for%20Distributed%20Eye%20Tracking%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yasasi%22%2C%22lastName%22%3A%22Abey
singhe%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bhanuka%22%2C%22lastName%22%3A%22Mahanama%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gavindya%22%2C%22lastName%22%3A%22Jayawardena%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mohan%22%2C%22lastName%22%3A%22Sunkara%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Vikas%22%2C%22lastName%22%3A%22Ashok%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sampath%22%2C%22lastName%22%3A%22Jayarathna%22%7D%5D%2C%22abstractNote%22%3A%22Understanding%20the%20focus%20and%20visual%20scanning%20behavior%20of%20users%20during%20a%20collaborative%20activity%20in%20a%20distributed%20environment%20can%20be%20helpful%20in%20improving%20users%5Cu2019%20engagement.%20Eye%20tracking%20measures%20can%20provide%20informative%20cues%20to%20understanding%20human%20visual%20search%20behavior.%20In%20this%20study%2C%20we%20present%20a%20distributed%20eye-tracking%20system%20with%20a%20gaze%20analytics%20dashboard.%20This%20system%20extracts%20eye%20movements%20from%20multiple%20participants%20utilizing%20common%20off-the-shelf%20eye%20trackers%2C%20generates%20real-time%20traditional%20positional%20gaze%20measures%20and%20advanced%20gaze%20measures%20such%20as%20ambient-focal%20coefficient%20%5C%5CmathcalK%2C%20and%20displays%20them%20in%20an%20interactive%20dashboard.%20We%20evaluate%20the%20proposed%20methodology%20by%20developing%20a%20gaze%20analytics%20dashboard%20and%20conducting%20a%20pilot%20study%20to%20%281%29%20investigate%20the%20relationship%20between%20%5C%5CmathcalK%20with%20collaborative%20behavior%2C%20and%20%282%29%20compare%20it%20against%20the%20User%20Experience%20Questionnaire%20%28UEQ%29%20benchmark.%20Our%20results%20show%20that%20groups%20that%20spent%20more%20time%20had%20more%20ambient%20attention%2C%20and%20our%20dashboard%20has%20a%20higher%20overall%20impression%20compared%20to%20the%20UEQ%20benchmark.%22%2C%22date%22%3A%222023-08%22%2C%22proceedingsTitle%22%3A%222023%20IEEE%2024th%20International%20Conference%20on%20Information%20Reuse%20and%20Integration%20for%20Data%20Science%20%28IRI%29%22%2C%22conferenceName%22%3A%222023%20IEEE%2024th%20International%20Conference%20on%20Information%20Reuse%20and%20Integration%20for%20Data%20Science%20%28IRI%29%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1109%5C%2FIRI58017.2023.00031%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-09-12T22%3A04%3A26Z%22%7D%7D%2C%7B%22key%22%3A%22UJF9ULUA%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Hong%20et%20al.%22%2C%22parsedDate%22%3A%222023-07-24%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EHong%2C%20W.%20C.%20H.%2C%20Ngan%2C%20H.%20F.%20B.%2C%20Yu%2C%20J.%2C%20%26amp%3B%20Arbouw%2C%20P.%20%282023%29.%20Examining%20cultural%20differences%20in%20Airbnb%20naming%20convention%20and%20user%20reception%3A%20an%20eye-tracking%20study.%20%3Ci%3EJournal%20of%20Travel%20%26amp%3B%20Tourism%20Marketing%3C%5C%2Fi%3E%2C%20%3Ci%3E40%3C%5C%2Fi%3E%286%29%2C%20475%26%23x2013%3B489.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1080%5C%2F10548408.2023.2263764%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1080%5C%2F10548408.2023.2263764%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%
3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Examining%20cultural%20differences%20in%20Airbnb%20naming%20convention%20and%20user%20reception%3A%20an%20eye-tracking%20study%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Wilson%20Cheong%20Hin%22%2C%22lastName%22%3A%22Hong%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Henrique%20F%5Cu00e1tima%20Boyol%22%2C%22lastName%22%3A%22Ngan%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Joanne%22%2C%22lastName%22%3A%22Yu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Paula%22%2C%22lastName%22%3A%22Arbouw%22%7D%5D%2C%22abstractNote%22%3A%22Listing%20titles%20significantly%20impact%20consumer%20behaviors%20on%20shared-economy%20platforms%20%28e.g.%20Airbnb%29.%20Due%20to%20limited%20research%20on%20the%20length%2C%20informativeness%2C%20and%20creativity%20of%20titles%20in%20this%20context%2C%20this%20research%20investigates%20consumers%5Cu2019%20responses%20and%20perceptions%20towards%20various%20types%20of%20listing%20titles%20by%20exploring%20their%20cognitive%20processes%20and%20conscious%20preferences.%20In%20a%20three-phase%20study%20design%2C%20this%20research%20analyzed%20Airbnb%20listing%20titles%20in%20China%20and%20America%2C%20conducted%20eye-tracking%20experiments%2C%20and%20collected%20surveys%20to%20evaluate%20the%20effects%20of%20title%20length%20and%20informativeness.%20The%20results%20revealed%20cultural%20differences%20in%20titles%2C%20but%20similar%20behaviors%20among%20Chinese%20and%20English-speaking%20consumers.%20Cultural%20assumptions%2C%20processing%2C%20and%20informativeness%20were%20discussed.%22%2C%22date%22%3A%222023-07-24%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1080%5C%2F10548408.2023.2263764%22%2C%22ISSN%22%3A%221054-8408%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1080%5C%2F10548408.2023.2263764%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-01-23T23%3A13%3A45Z%22%7D%7D%2C%7B%22key%22%3A%22SSWTR3V2%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22parsedDate%22%3A%222023-07-23%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Ci%3E%28PDF%29%20Eye%20Tracking%20as%20a%20Research%20and%20Training%20Tool%20for%20Ensuring%20Quality%20Education%3C%5C%2Fi%3E.%20%282023%2C%20July%2023%29.%20ResearchGate.%20https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1007%5C%2F978-3-031-30498-9_28%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22webpage%22%2C%22title%22%3A%22%28PDF%29%20Eye%20Tracking%20as%20a%20Research%20and%20Training%20Tool%20for%20Ensuring%20Quality%20Education%22%2C%22creators%22%3A%5B%5D%2C%22abstractNote%22%3A%22PDF%20%7C%20According%20to%20the%20Sustainable%20Development%20Goal%20%5Cu201cEnsuring%20quality%20education%20for%20all%20is%20fundamental%20to%20creating%20peace%20and%20prosperity%20in%20the%20world%5Cu201d%2C...%20%7C%20Find%2C%20read%20and%20cite%20all%20the%20research%20you%20need%20on%20ResearchGate%22%2C%22date%22%3A%222023-07-23%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.researchgate.net%5C%2Fpublication%5C%2F372197481_Eye_Tracking_as_a_Research_and_Training_Tool_for_Ensuring_Quality_Education%22%2C%22language%22%3A%22en%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-09-11T23%3A22%3A02Z%22%7D%7D%2C%7B%22k
ey%22%3A%22E7DACIAQ%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Han%22%2C%22parsedDate%22%3A%222023-07-01%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EHan%2C%20E.%20%282023%29.%20Comparing%20the%20Perception%20of%20In-Person%20and%20Digital%20Monitor%20Viewing%20of%20Paintings.%20%3Ci%3EEmpirical%20Studies%20of%20the%20Arts%3C%5C%2Fi%3E%2C%20%3Ci%3E41%3C%5C%2Fi%3E%282%29%2C%20465%26%23x2013%3B496.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F02762374231158520%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F02762374231158520%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Comparing%20the%20Perception%20of%20In-Person%20and%20Digital%20Monitor%20Viewing%20of%20Paintings%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eugene%22%2C%22lastName%22%3A%22Han%22%7D%5D%2C%22abstractNote%22%3A%22In%20the%20context%20of%20rapidly%20developing%20technologies%20and%20widespread%20online%20access%2C%20it%20is%20important%20to%20understand%20how%20our%20perception%20of%20images%20on%20a%20computer%20screen%20may%20vary%20from%20traditional%20in-person%20encounters.%20This%20research%20compared%20the%20perception%20of%20subjects%20in%20view%20of%20eight%20paintings%2C%20presented%20either%20on%20a%20computer%20monitor%20or%20as%20a%20printed%20reproduction.%20Both%20stationary%20and%20mobile%20eye-tracking%20technologies%20were%20used%20to%20analyze%20the%20viewing%20patterns%20of%20both%20forms%20of%20engagement.%20Results%20suggested%20that%20subjects%20engaging%20with%20physical%20works%20tended%20to%20exhibit%20more%20varied%20fixational%20patterns%20than%20those%20viewing%20the%20same%20works%20on%20a%20computer%20monitor.%20Data%20showed%20parity%20in%20the%20high%20degree%20of%20correlation%20between%20viewing%20times%20and%20personal%20preference%2C%20regardless%20of%20viewing%20medium.%20The%20results%20indicate%20that%20the%20modalities%20through%20which%20we%20engage%20with%20works%20of%20art%20matter%2C%20and%20that%20a%20single%20image%20can%20resonate%20across%20an%20array%20of%20media.%22%2C%22date%22%3A%222023-07-01%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1177%5C%2F02762374231158520%22%2C%22ISSN%22%3A%220276-2374%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F02762374231158520%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-06-22T02%3A41%3A49Z%22%7D%7D%2C%7B%22key%22%3A%22ZKV8YSUM%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Koutsogiorgi%20and%20Michaelides%22%2C%22parsedDate%22%3A%222023-07%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EKoutsogiorgi%2C%20C.%20C.%2C%20%26amp%3B%20Michaelides%2C%20M.%20P.%20%282023%29.%20Response%20Tendencies%20to%20Positively%20and%20Negatively%20Worded%20Items%20of%20the%26%23xA0%3B%20%26%23xA0%3B%20%26%23xA0%3B%20Rosenberg%20Self-Esteem%20Scale%20With%20Eye-Tracking%20Methodology.%20%3Ci%3EEuropean%20Journal%20of%20Psychological%20Assessment%3C%5C%2Fi%3E%2C%20%3Ci%3E39%3C%5C%2Fi%3E%284%29%2C%20307%26%23x2013%3B315.
%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1027%5C%2F1015-5759%5C%2Fa000772%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1027%5C%2F1015-5759%5C%2Fa000772%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Response%20Tendencies%20to%20Positively%20and%20Negatively%20Worded%20Items%20of%20the%20%5Ct%5Ct%5Ct%5Ct%5CtRosenberg%20Self-Esteem%20Scale%20With%20Eye-Tracking%20Methodology%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Chrystalla%20C.%22%2C%22lastName%22%3A%22Koutsogiorgi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Michalis%20P.%22%2C%22lastName%22%3A%22Michaelides%22%7D%5D%2C%22abstractNote%22%3A%22%3A%20The%20Rosenberg%20Self-Esteem%20Scale%20%28RSES%29%20was%20developed%20as%20a%20unitary%20scale%20to%20assess%20attitudes%20toward%20the%20self.%20Previous%20studies%20have%20shown%20differences%20in%20responses%20and%20psychometric%20indices%20between%20the%20positively%20and%20negatively%20worded%20items%2C%20suggesting%20differential%20processing%20of%20responses.%20The%20current%20study%20examined%20differences%20in%20response%20behaviors%20toward%20two%20positively%20and%20two%20negatively%20worded%20items%20of%20the%20RSES%20with%20eye-tracking%20methodology%20and%20explored%20whether%20those%20differences%20were%20more%20pronounced%20among%20individuals%20with%20higher%20neuroticism%2C%20controlling%20for%20verbal%20abilities%20and%20mood.%20Eighty-seven%20university%20students%20completed%20a%20computerized%20version%20of%20the%20scale%2C%20while%20their%20responses%2C%20response%20time%2C%20and%20eye%20movements%20were%20recorded%20through%20the%20Gazepoint%20GP3%20HD%20eye-tracker.%20In%20linear%20mixed-effects%20models%2C%20two%20negatively%20worded%20items%20elicited%20higher%20scores%20%28elicited%20stronger%20disagreement%29%20in%20self-esteem%2C%20and%20different%20response%20processes%2C%20for%20example%2C%20longer%20viewing%20times%2C%20than%20two%20positively%20worded%20items.%20Neuroticism%20predicted%20lower%20responses%20and%20more%20revisits%20to%20item%20statements.%20Eye-tracking%20can%20enhance%20the%20examination%20of%20response%20tendencies%20and%20the%20role%20of%20wording%20and%20its%20interaction%20with%20individual%20characteristics%20at%20different%20stages%20of%20the%20response%20process.%22%2C%22date%22%3A%222023-07%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1027%5C%2F1015-5759%5C%2Fa000772%22%2C%22ISSN%22%3A%221015-5759%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fecontent.hogrefe.com%5C%2Fdoi%5C%2Fabs%5C%2F10.1027%5C%2F1015-5759%5C%2Fa000772%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-09-12T21%3A32%3A57Z%22%7D%7D%2C%7B%22key%22%3A%22DXIPUX86%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Foroughi%20et%20al.%22%2C%22parsedDate%22%3A%222023-06-01%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EForoughi%2C%20C.%20K.%2C%20Devlin%2C%20S.%2C%20Pak%2C%20R.%2C%20Brown%2C%20N.%20L.%2C%20Sibley%2C%20C.%2C%20%26amp%3B%20Coyne%2C%20J.%20T.%20%282023%29.%20Near-Perfect%20Automation%3A%20Investigating%20Performance%2C%20Trust%2C%20and%20Visual%20Attention%20Allocation.%20%3Ci%3EHuman%20Factors%3C%5C%2Fi%3E%2C%20%3Ci%3E65%3C%5C%2Fi%3E%284%29%2C%20546%26%23x2013%3B561.%20%3Ca%
20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F00187208211032889%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F00187208211032889%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Near-Perfect%20Automation%3A%20Investigating%20Performance%2C%20Trust%2C%20and%20Visual%20Attention%20Allocation%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Cyrus%20K.%22%2C%22lastName%22%3A%22Foroughi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Shannon%22%2C%22lastName%22%3A%22Devlin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Richard%22%2C%22lastName%22%3A%22Pak%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Noelle%20L.%22%2C%22lastName%22%3A%22Brown%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ciara%22%2C%22lastName%22%3A%22Sibley%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Joseph%20T.%22%2C%22lastName%22%3A%22Coyne%22%7D%5D%2C%22abstractNote%22%3A%22Objective%5CnAssess%20performance%2C%20trust%2C%20and%20visual%20attention%20during%20the%20monitoring%20of%20a%20near-perfect%20automated%20system.%5CnBackground%5CnResearch%20rarely%20attempts%20to%20assess%20performance%2C%20trust%2C%20and%20visual%20attention%20in%20near-perfect%20automated%20systems%20even%20though%20they%20will%20be%20relied%20on%20in%20high-stakes%20environments.%5CnMethods%5CnSeventy-three%20participants%20completed%20a%2040-min%20supervisory%20control%20task%20where%20they%20monitored%20three%20search%20feeds.%20All%20search%20feeds%20were%20100%25%20reliable%20with%20the%20exception%20of%20two%20automation%20failures%3A%20one%20miss%20and%20one%20false%20alarm.%20Eye-tracking%20and%20subjective%20trust%20data%20were%20collected.%5CnResults%5CnThirty-four%20percent%20of%20participants%20correctly%20identified%20the%20automation%20miss%2C%20and%2067%25%20correctly%20identified%20the%20automation%20false%20alarm.%20Subjective%20trust%20increased%20when%20participants%20did%20not%20detect%20the%20automation%20failures%20and%20decreased%20when%20they%20did.%20Participants%20who%20detected%20the%20false%20alarm%20had%20a%20more%20complex%20scan%20pattern%20in%20the%202%20min%20centered%20around%20the%20automation%20failure%20compared%20with%20those%20who%20did%20not.%20Additionally%2C%20those%20who%20detected%20the%20failures%20had%20longer%20dwell%20times%20in%20and%20transitioned%20to%20the%20center%20sensor%20feed%20significantly%20more%20often.%5CnConclusion%5CnNot%20only%20does%20this%20work%20highlight%20the%20limitations%20of%20the%20human%20when%20monitoring%20near-perfect%20automated%20systems%2C%20it%20begins%20to%20quantify%20the%20subjective%20experience%20and%20attentional%20cost%20of%20the%20human.%20It%20further%20emphasizes%20the%20need%20to%20%281%29%20reevaluate%20the%20role%20of%20the%20operator%20in%20future%20high-stakes%20environments%20and%20%282%29%20understand%20the%20human%20on%20an%20individual%20level%20and%20actively%20design%20for%20the%20given%20individual%20when%20working%20with%20near-perfect%20automated%20systems.%5CnApplication%5CnMultiple%20operator-level%20measures%20should%20be%20collected%20in%20real-time%20in%20order%20to%20monitor%20an%20operator%5Cu2019s%20state%20and%20leverage%20real-time%2C%20individualized%20assistance.%22%2C%22date%22%3A%222023-06-01%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1177%5C%2F00187208211032889%22%2C%22ISSN%22%3A%220018-7208%22%
2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1177%5C%2F00187208211032889%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-01-23T01%3A35%3A45Z%22%7D%7D%2C%7B%22key%22%3A%22ZAXZVKA9%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Pillai%20et%20al.%22%2C%22parsedDate%22%3A%222023-06%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EPillai%2C%20P.%2C%20Balasingam%2C%20B.%2C%20%26amp%3B%20Biondi%2C%20F.%20N.%20%282023%29.%20Model-Based%20Estimation%20of%20Mental%20Workload%20in%20Drivers%20Using%20Pupil%20Size%20Measurements.%20%3Ci%3E2023%20IEEE%5C%2FASME%20International%20Conference%20on%20Advanced%20Intelligent%20Mechatronics%20%28AIM%29%3C%5C%2Fi%3E%2C%20815%26%23x2013%3B821.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FAIM46323.2023.10196230%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1109%5C%2FAIM46323.2023.10196230%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Model-Based%20Estimation%20of%20Mental%20Workload%20in%20Drivers%20Using%20Pupil%20Size%20Measurements%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Prarthana%22%2C%22lastName%22%3A%22Pillai%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Balakumar%22%2C%22lastName%22%3A%22Balasingam%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Francesco%20N.%22%2C%22lastName%22%3A%22Biondi%22%7D%5D%2C%22abstractNote%22%3A%22Passenger%20vehicles%20are%20increasingly%20adopting%20the%20use%20of%20automated%20driving%20systems%20%28ADS%29%20to%20help%20ease%20the%20workload%20of%20drivers%20and%20to%20improve%20road%20safety.%20These%20systems%20require%20human%20drivers%20to%20constantly%20maintain%20supervisory%20control%20of%20the%20ADS.%20For%20safe%20adoption%20and%20ADS%2C%20the%20attention%20or%20alertness%20of%20the%20driver%20needs%20to%20be%20continuously%20monitored.%20Past%20studies%20have%20demonstrated%20pupil%20dilation%20as%20an%20effective%20measure%20of%20cognitive%20load.%20However%2C%20the%20raw%20pupil%20data%20recorded%20using%20eye%20trackers%20are%20noisy%20which%20may%20result%20in%20poor%20classification%20of%20the%20cognitive%20load%20levels%20of%20the%20driver.%20In%20this%20paper%2C%20an%20approach%20to%20reduce%20the%20noise%20raw%20pupil%20size%20data%20obtained%20from%20eye%20trackers%20used%20by%20ADS%20is%20proposed.%20The%20proposed%20approach%20uses%20a%20Kalman%20filter%20to%20filter%20out%20high-frequency%20noise%20that%20arises%20due%20to%20sudden%20changes%20in%20ambient%20light%2C%20head%5C%2Fbody%20movement%2C%20and%20measurement%20noise.%20Data%20collected%20from%2016%20participants%20were%20used%20to%20demonstrate%20the%20performance%20of%20the%20model-based%20pupil-size%20filtering%20approach%20presented%20in%20this%20paper.%20Results%20show%20an%20objective%20improvement%20in%20the%20potential%20to%20distinguish%20changes%20in%20pupil%20size%20due%20to%20various%20levels%20of%20cognitive%20workload%20experienced%20by%20participants.%22%2C%22date%22%3A%222023-06%22%2C%22proceedingsTitle%22%3A%222023%20IEEE%5C%2FASME%20International%20Conference%20on%20Advanced%20Intelligent%20Mechatronics%20%28AIM%29%22%2C%22conferenceName%22%3A%222023%20IEEE%5C%2FASME%20International%20Conference%20on%20
Advanced%20Intelligent%20Mechatronics%20%28AIM%29%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1109%5C%2FAIM46323.2023.10196230%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222023-09-12T18%3A02%3A06Z%22%7D%7D%2C%7B%22key%22%3A%228L6YCUQ6%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Warchol-Jakubowska%20et%20al.%22%2C%22parsedDate%22%3A%222023-05-30%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EWarchol-Jakubowska%2C%20A.%2C%20Krejtz%2C%20I.%2C%20%26amp%3B%20Krejtz%2C%20K.%20%282023%29.%20An%20irrelevant%20look%20of%20novice%20tram%20driver%3A%20Visual%20attention%20distribution%20of%20novice%20and%20expert%20tram%20drivers.%20%3Ci%3EProceedings%20of%20the%202023%20Symposium%20on%20Eye%20Tracking%20Research%20and%20Applications%3C%5C%2Fi%3E%2C%201%26%23x2013%3B3.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1145%5C%2F3588015.3589514%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1145%5C%2F3588015.3589514%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22An%20irrelevant%20look%20of%20novice%20tram%20driver%3A%20Visual%20attention%20distribution%20of%20novice%20and%20expert%20tram%20drivers%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anna%22%2C%22lastName%22%3A%22Warchol-Jakubowska%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Izabela%22%2C%22lastName%22%3A%22Krejtz%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Krzysztof%22%2C%22lastName%22%3A%22Krejtz%22%7D%5D%2C%22abstractNote%22%3A%22The%20present%20study%20explores%20differences%20in%20attention%20distribution%20of%20tram%20drivers%20with%20different%20expertise%20while%20watching%20tram%20driving%20simulations.%20Forty-seven%20participants%20participated%20in%20this%20experiment%20in%20two%20groups%20%2823%20experts%20and%2024%20novices%29.%20The%20results%20show%20between-group%20differences%20in%20attention%20dynamics.%20In%20line%20with%20prediction%2C%20the%20novices%20concentrated%20more%20on%20the%20middle%20panel%20of%20the%20tram%20simulator%20related%20to%20speed%20control%20than%20the%20experts.%20The%20study%20is%20the%20first%20step%20in%20designing%20gaze-based%20training%20for%20novice%20tram%20drivers.%22%2C%22date%22%3A%22May%2030%2C%202023%22%2C%22proceedingsTitle%22%3A%22Proceedings%20of%20the%202023%20Symposium%20on%20Eye%20Tracking%20Research%20and%20Applications%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1145%5C%2F3588015.3589514%22%2C%22ISBN%22%3A%229798400701504%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdl.acm.org%5C%2Fdoi%5C%2F10.1145%5C%2F3588015.3589514%22%2C%22collections%22%3A%5B%5D%2C%22dateModified%22%3A%222024-01-23T18%3A47%3A59Z%22%7D%7D%2C%7B%22key%22%3A%222WNC8WV7%22%2C%22library%22%3A%7B%22id%22%3A3151148%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Moreira%20et%20al.%22%2C%22parsedDate%22%3A%222023-05-30%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EMoreira%2C%20C.%2C%20Alvito%2C%20D.%20M.%2C%20Sousa%2C%20S.%20C.%2C%20Nobre%2C%20I.%
20M.%20G.%20B.%2C%20Ouyang%2C%20C.%2C%20Kopper%2C%20R.%2C%20Duchowski%2C%20A.%2C%20%26amp%3B%20Jorge%2C%20J.%20%282023%29.%20Comparing%20Visual%20Search%20Patterns%20in%20Chest%20X-Ray%20Diagnostics.%20%3Ci%3EProceedings%20of%20the%202023%20Symposium%20on%20Eye%20Tracking%20Research%20and%20Applications%3C%5C%2Fi%3E%2C%201%26%23x2013%3B6.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1145%5C%2F3588015.3588403%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1145%5C%2F3588015.3588403%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Comparing%20Visual%20Search%20Patterns%20in%20Chest%20X-Ray%20Diagnostics%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Catarina%22%2C%22lastName%22%3A%22Moreira%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Diogo%20Miguel%22%2C%22lastName%22%3A%22Alvito%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sandra%20Costa%22%2C%22lastName%22%3A%22Sousa%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Isabel%20Maria%20Gomes%20Blanco%22%2C%22lastName%22%3A%22Nobre%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Chun%22%2C%22lastName%22%3A%22Ouyang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Regis%22%2C%22lastName%22%3A%22Kopper%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Andrew%22%2C%22lastName%22%3A%22Duchowski%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Joaquim%22%2C%22lastName%22%3A%22Jorge%22%7D%5D%2C%22abstractNote%22%3A%22Radiologists%20are%20trained%20professionals%20who%20use%20medical%20images%20to%20obtain%20clinically%20relevant%20information.%20However%2C%20little%20is%20known%20about%20visual%20search%20patterns%20and%20strategies%20radiologists%20employ%20during%20medical%20image%20analysis.%20Thus%2C%20there%20is%20a%20current%20need%20for%20guidelines%20to%20specify%20optimal%20visual%20search%20routines%20commonly%20used%20by%20radiologists.%20Identifying%20these%20features%20could%20improve%20radiologist%20training%20and%20assist%20radiologists%20in%20their%20work.%20Our%20study%20found%20that%20during%20the%20moments%20in%20which%20radiologists%20view%20chest%20X-ray%20images%20in%20silence%20before%20verbalizing%20the%20analysis%2C%20they%20exhibit%20unique%20search%20patterns%20regardless%20of%20the%20type%20of%20disease%20depicted.%20Our%20findings%20suggest%20that%20radiologists%5Cu2019%20search%20behaviors%20can%20be%20identified%20at%20this%20stage.%20However%2C%20when%20radiologists%20verbally%20interpret%20the%20X-rays%2C%20the%20gaze%20patterns%20appear%20noisy%20and%20arbitrary.%20Current%20deep-learning%20approaches%20train%20their%20systems%20using%20this%20noisy%20and%20arbitrary%20gaze%20data.%20This%20may%20explain%20why%20previous%20research%20still%20needs%20to%20show%20the%20superiority%20of%20deep-learning%20models%20that%20use%20eye%20tracking%20for%20disease%20classification.%20Our%20paper%20investigates%20these%20patterns%20and%20attempts%20to%20uncover%20the%20eye-gaze%20configurations%20during%20the%20different%20analysis%20phases.%22%2C%22date%22%3A%22May%2030%2C%202023%22%2C%22proceedingsTitle%22%3A%22Proceedings%20of%20the%202023%20Symposium%20on%20Eye%20Tracking%20Research%20and%20Applications%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1145%5C%2F3588015.3588403%22%2C%22ISBN%22%3A%229798400701504%22%2C%22url%22%3A%22https%3A%5C%2F%5
Dondi, P., Sapuppo, S., & Porta, M. (2024). Leyenes: A gaze-based text entry method using linear smooth pursuit and target speed. International Journal of Human-Computer Studies, 184, 103204. https://doi.org/10.1016/j.ijhcs.2023.103204
Yin, R., & Neyens, D. M. (2024). Examining how information presentation methods and a chatbot impact the use and effectiveness of electronic health record patient portals: An exploratory study. Patient Education and Counseling, 119, 108055. https://doi.org/10.1016/j.pec.2023.108055
Cieśla, M., & Dzieńkowski, M. (2023). An analysis of the implementation of accessibility tools on websites. Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska, 13(4), 51–56. https://doi.org/10.35784/iapgos.4459
Khairunnisa, G., & Sari, H. (2023). Eye Tracking-based Analysis of Customer Interest on The Effectiveness of Eco-friendly Product Advertising Content. Jurnal Optimasi Sistem Industri, 22, 153–164. https://doi.org/10.25077/josi.v22.n2.p153-164.2023
Chvátal, R., Slezáková, J., & Popelka, S. (2023). Analysis of problem-solving strategies for the development of geometric imagination using eye-tracking. Education and Information Technologies. https://doi.org/10.1007/s10639-023-12395-z
Zhang, C., Tian, C., Han, T., Li, H., Feng, Y., Chen, Y., Proctor, R. W., & Zhang, J. (2023). Evaluation of Infrastructure-based Warning System on Driving Behaviors – A Roundabout Study.
Sun, L., Zhang, M., Qiu, Y., & Zhang, C. (2023). Effects of Sleep Deprivation and Hazard Types on the Visual Search Patterns and Hazard Response Times of Taxi Drivers. Behavioral Sciences, 13(12), 1005. https://doi.org/10.3390/bs13121005
Mok, S., Park, S., & Whang, M. (2023). Examining the Impact of Digital Human Gaze Expressions on Engagement Induction. Biomimetics, 8(8), 610. https://doi.org/10.3390/biomimetics8080610
Chang, Y.-C., Gandi, N., Shin, K., Mun, Y.-J., Driggs-Campbell, K., & Kim, J. (2023). Specifying Target Objects in Robot Teleoperation Using Speech and Natural Eye Gaze. 2023 IEEE-RAS 22nd International Conference on Humanoid Robots (Humanoids), 1–7. https://doi.org/10.1109/Humanoids57100.2023.10375186
Kowalewski, S. J., & Williamson, B. (2023). Fostering Advocacy, Developing Empathetic UX Bricoleurs: Ongoing Programmatic Assessment and Responsive Curriculum Design. IEEE Transactions on Professional Communication, 66(4), 382–396. https://doi.org/10.1109/TPC.2023.3320530
Dang, A., & Nichols, B. S. (2023). The effects of size referents in user-generated photos on online review helpfulness. Journal of Consumer Behaviour. Advance online publication. https://doi.org/10.1002/cb.2281
Kusumo, A. H. (2023). Has Website Design using Website Builder Fulfilled Usability Aspects? A Study Case of Three Website Builders. 545–557. https://doi.org/10.2991/978-94-6463-288-0_45
Lee, S., Byun, G., & Ha, M. (2023). Exploring the association between environmental factors and fear of crime in residential streets: an eye-tracking and questionnaire study. Journal of Asian Architecture and Building Engineering, 1–18. https://doi.org/10.1080/13467581.2023.2278449
Cybulski, P., Medyńska-Gulij, B., & Horbiński, T. (2023). Users’ Visual Experience During Temporal Navigation in Forecast Weather Maps on Mobile Devices. Journal of Geovisualization and Spatial Analysis, 7(2), 32. https://doi.org/10.1007/s41651-023-00160-2
Cui, Y., Liu, X., & Cheng, Y. (2023). Reader perception of and attitude to English-Chinese advertising posters: an eye tracking study. SN Social Sciences, 3(11), 192. https://doi.org/10.1007/s43545-023-00782-9
S Kumar, D., Sahadev, S., & Purani, K. (2023). Visual Aesthetic Quotient: Establishing the Effects of Computational Aesthetic Measures for Servicescape Design. Journal of Service Research, 10946705231205000. https://doi.org/10.1177/10946705231205000
Cheng, G., Zou, D., Xie, H., & Lee Wang, F. (2023). Exploring differences in self-regulated learning strategy use between high- and low-performing students in introductory programming: An analysis of eye-tracking and retrospective think-aloud data from program comprehension. Computers & Education, 104948. https://doi.org/10.1016/j.compedu.2023.104948
Segedinac, M., Savić, G., Zeljković, I., Slivka, J., & Konjović, Z. (2023). Assessing code readability in Python programming courses using eye-tracking. Computer Applications in Engineering Education. Advance online publication. https://doi.org/10.1002/cae.22685
Novia, R., Titis, W., & Mirwan, U. (2023). An eye tracking study of customers’ visual attention to the fast-food chain’s page on instagram. AIP Conference Proceedings, 2510(1), 030042. https://doi.org/10.1063/5.0129351
Hwang, E., & Lee, J. (2023). Attention-based automatic editing of virtual lectures for reduced production labor and effective learning experience. International Journal of Human-Computer Studies, 103161. https://doi.org/10.1016/j.ijhcs.2023.103161
Biondi, F. N., Graf, F., Pillai, P., & Balasingam, B. (2023). On validating a generic camera-based blink detection system for cognitive load assessment. Cognitive Computation and Systems. Advance online publication. https://doi.org/10.1049/ccs2.12088
Contemori, G., Oletto, C. M., Battaglini, L., Motterle, E., & Bertamini, M. (2023). Foveal feedback in perceptual processing: Contamination of neural representations and task difficulty effects. PLOS ONE, 18(10), e0291275. https://doi.org/10.1371/journal.pone.0291275
Kamal, M., Möbius, M., Bartella, A. K., & Lethaus, B. (2023). Perception of aesthetic features after surgical treatment of craniofacial malformations by observers of the same age: An eye-tracking study. Journal of Cranio-Maxillofacial Surgery. https://doi.org/10.1016/j.jcms.2023.09.009
Inoue, M., Nishiyama, M., & Iwai, Y. (2023). Age group identification using gaze-guided feature extraction. 2023 IEEE 12th Global Conference on Consumer Electronics (GCCE), 708–711. https://doi.org/10.1109/GCCE59613.2023.10315305
Sims, J. P., Haynes, A., & Lanius, C. (2023). Exploring the utility of eye tracking for sociological research on race. The British Journal of Sociology. Advance online publication. https://doi.org/10.1111/1468-4446.13054
Aslan, M., Baykara, M., & Alakuş, T. B. (2023). LSTMNCP: lie detection from EEG signals with novel hybrid deep learning method. Multimedia Tools and Applications. https://doi.org/10.1007/s11042-023-16847-z
Yao, J., Su, S., & Liu, S. (2023). The effect of key audit matters reviewing on loan approval decisions? Finance Research Letters, 104467. https://doi.org/10.1016/j.frl.2023.104467
Furukado, R., & Hagiwara, G. (2023). Gaze and Electroencephalography (EEG) Parameters in Esports: Examinations Considering Genres and Skill Levels.
Liu, Y., Ghaiumy Anaraky, R., Aly, H., & Byrne, K. (2023). The Effect of Privacy Fatigue on Privacy Decision-Making Behavior. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 67(1), 2428–2433. https://doi.org/10.1177/21695067231193670
Biondi, F. N., Graf, F., Pillai, P., & Balasingam, B. (2023). On Validating a Generic Video-Based Blink Detection System for Cognitive Load Detection. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 67(1), 1425–1430. https://doi.org/10.1177/21695067231192924
Hatzithomas, L., Theodorakioglou, F., Margariti, K., & Boutsouki, C. (2023). Cross-media advertising strategies and brand attitude: the role of cognitive load. International Journal of Advertising, 0(0), 1–33. https://doi.org/10.1080/02650487.2023.2249342
Prahm, C., Konieczny, J., Bressler, M., Heinzel, J., Daigeler, A., Kolbenschlag, J., & Lauer, H. (2023). Influence of colored face masks on judgments of facial attractiveness and gaze patterns. Acta Psychologica, 239, 103994. https://doi.org/10.1016/j.actpsy.2023.103994
Jaśkowiec, M., & Kowalska-Chrzanowska, M. (2023). The Use of Games in Citizen Science Based on Findings from the EyeWire User Study. Games and Culture, 15554120231196260. https://doi.org/10.1177/15554120231196260
Rocca, F., Dave, M., Duvivier, V., Van Daele, A., Demeuse, M., Derobertmasure, A., Mancas, M., & Gosselin, B. (2023). Designing an Assistance Tool for Analyzing and Modeling Trainer Activity in Professional Training Through Simulation. Proceedings of the 2023 ACM International Conference on Interactive Media Experiences, 180–187. https://doi.org/10.1145/3573381.3596475
Huang, J., Raja, J., Cantor, C., Marx, W., Galgano, S., Zarzour, J., Caridi, T., Gunn, A., Morgan, D., & Smith, A. (2023). Eye Motion Tracking for Medical Image Interpretation Training. Current Problems in Diagnostic Radiology. https://doi.org/10.1067/j.cpradiol.2023.08.013
Brancucci, A., Ferracci, S., D’Anselmo, A., & Manippa, V. (2023). Hemispheric functional asymmetries and sex effects in visual bistable perception. Consciousness and Cognition, 113, 103551. https://doi.org/10.1016/j.concog.2023.103551
Lu, H.-Y., Lin, Y.-C., Chen, C.-H., Wang, C.-C., Han, I.-W., & Liang, W.-L. (2023). Detecting Children with Autism Spectrum Disorder Based on Eye-tracking and Machine Learning. 2023 IEEE 6th International Conference on Knowledge Innovation and Invention (ICKII), 372–375. https://doi.org/10.1109/ICKII58656.2023.10332630
Abeysinghe, Y., Mahanama, B., Jayawardena, G., Sunkara, M., Ashok, V., & Jayarathna, S. (2023). Gaze Analytics Dashboard for Distributed Eye Tracking. 2023 IEEE 24th International Conference on Information Reuse and Integration for Data Science (IRI), 140–145. https://doi.org/10.1109/IRI58017.2023.00031
Hong, W. C. H., Ngan, H. F. B., Yu, J., & Arbouw, P. (2023). Examining cultural differences in Airbnb naming convention and user reception: an eye-tracking study. Journal of Travel & Tourism Marketing, 40(6), 475–489. https://doi.org/10.1080/10548408.2023.2263764
Eye Tracking as a Research and Training Tool for Ensuring Quality Education. (2023). https://doi.org/10.1007/978-3-031-30498-9_28
Han, E. (2023). Comparing the Perception of In-Person and Digital Monitor Viewing of Paintings. Empirical Studies of the Arts, 41(2), 465–496. https://doi.org/10.1177/02762374231158520
Koutsogiorgi, C. C., & Michaelides, M. P. (2023). Response Tendencies to Positively and Negatively Worded Items of the Rosenberg Self-Esteem Scale With Eye-Tracking Methodology. European Journal of Psychological Assessment, 39(4), 307–315. https://doi.org/10.1027/1015-5759/a000772
Foroughi, C. K., Devlin, S., Pak, R., Brown, N. L., Sibley, C., & Coyne, J. T. (2023). Near-Perfect Automation: Investigating Performance, Trust, and Visual Attention Allocation. Human Factors, 65(4), 546–561. https://doi.org/10.1177/00187208211032889
Pillai, P., Balasingam, B., & Biondi, F. N. (2023). Model-Based Estimation of Mental Workload in Drivers Using Pupil Size Measurements. 2023 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), 815–821. https://doi.org/10.1109/AIM46323.2023.10196230
Warchol-Jakubowska, A., Krejtz, I., & Krejtz, K. (2023). An irrelevant look of novice tram driver: Visual attention distribution of novice and expert tram drivers. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1–3. https://doi.org/10.1145/3588015.3589514
Moreira, C., Alvito, D. M., Sousa, S. C., Nobre, I. M. G. B., Ouyang, C., Kopper, R., Duchowski, A., & Jorge, J. (2023). Comparing Visual Search Patterns in Chest X-Ray Diagnostics. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1–6. https://doi.org/10.1145/3588015.3588403
Hahn, A., Riedelsheimer, J., Royer, Z., Frederick, J., Kee, R., Crimmins, R., Huber, B., Harris, D., & Jantzen, K. (2023). Effects of Cleft Lip on Visual Scanning and Neural Processing of Infant Faces [Preprint]. Preprints. https://doi.org/10.22541/au.168455102.24287447/v1
Jiang, Y., Leiva, L. A., Rezazadegan Tavakoli, H., R. B. Houssel, P., Kylmälä, J., & Oulasvirta, A. (2023). UEyes: Understanding Visual Saliency across User Interface Types. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–21. https://doi.org/10.1145/3544548.3581096
Shamy, M., & Feitelson, D. G. (2023). Identifying Lines and Interpreting Vertical Jumps in Eye Tracking Studies of Reading Text and Code. ACM Transactions on Applied Perception, 20(2), 6:1-6:20. https://doi.org/10.1145/3579357
Lobodenko, L., Cheredniakova, A., Shesterkina, L., & Kharitonova, O. (2023). Eye-Tracking Technologies in the Analysis of Environmental Advertising and Journalistic Texts Perception by Youth. 2023 Communication Strategies in Digital Society Seminar (ComSDS), 78–85. https://doi.org/10.1109/ComSDS58064.2023.10130433