====== Notes for design of the small screen version of the "Gigapixel memory" study ======
  
(To add notes: log in, click on the pencil-looking icon to the right to edit, and then just start typing ...)
  
===== 2016-02-10 Blender files directory in VNLab folder =====
====== End of Spring 2015 semester notes ======
  
[taken by __**Daijreous Poole**__ (?)]
  
===== Brainstorming =====
====== 2015.9.28 lab meeting ======
  
===== AC's notes before the meeting =====
  
  * Specify display and PC to use
==== Comparison with large display version ====
  
  * **Stephanie**: What's the focus: equating the Gigapixel display context, or difference between user interfaces?
  
  * **James**: loss of proprioceptive cues for item location unavoidable
  
  * Draw bezels in picture.
  * Pan out farther than the whole image, to include view of feet of the GP monitor?
  
  * AC: give up on drawing cues from outside of frame of image.
  
==== UI (User Interface) ====
====== 2015.10.12 lab meeting notes ======
  
[by AC]
  
===== Software for tracking interaction with display =====
http://linux.die.net/man/1/xev
  
====== 2015.10.24 Notes ======

===== Notes on software =====

__**SR**__ (2015.10.24): Downloaded and tried MouseTracker (http://www.mousetracker.org/)
  * Very simple display, maybe too simplistic for our needs.
  * Not sure how pan/zoom could be applied through native GUI.
===== Pseudo-code for a possible pan/zoom interface =====
  
[AC's notes ahead of meeting:]
  
<code>
====== Notes about matlab tools for interactive UI ======
  
By AC:
  
===== Overview panel for zoomed-in images =====
===== Matlab efforts =====
  
AC made a new directory on uniqua: ''/home/anthony/Documents/MATLAB/LLID/lsnd/''
  
"lsnd" stands for "Learning in Small Non-interactive Displays," which is meant to mean the opposite of the abbreviation of the overall study (LLID = Learning in Large Interactive Displays).
\\
----

====== Notes from 2016-02-29 ======

\\

===== Notes from Fall 2015: =====

The goal for this semester was to figure out how to track cursor movement on the small screen. To do this we have been learning how to use software called Blender. Meanwhile, Anthony has been trying to do the same using MATLAB. By the end of the semester, we hope to decide which software to use for the final study.

Exact notes on Blender follow:

==== Blender Tasks: ====

  * Record video of participant.
  * Find tracking points to track zooming: zooming can be tracked and quantified based on the screen recording inside of Blender.
    * To do: find out how to get coordinates from a ten-second recording session.
    * Use script to get tracking coordinates (uploaded on the lab Google Drive).
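A minimal sketch of the coordinate-extraction step. This is illustrative only: the lab's actual script lives on the Google Drive and its real output format may differ. Here we assume the script dumps one whitespace-separated ''frame x y'' line per tracked frame, with x/y as normalized screen coordinates.

```python
# Hypothetical format: one "frame x y" line per tracked frame.
def read_track(text):
    """Parse tracking coordinates into a list of (frame, x, y) tuples."""
    rows = []
    for line in text.splitlines():
        if not line.strip():
            continue  # skip blank lines
        frame, x, y = line.split()
        rows.append((int(frame), float(x), float(y)))
    return rows

sample = "1 0.50 0.50\n2 0.52 0.49\n3 0.55 0.47\n"
print(read_track(sample))  # [(1, 0.5, 0.5), (2, 0.52, 0.49), (3, 0.55, 0.47)]
```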

\\
----

__**James & Agilay**__: working on tracking participant responses and programming loops. Still a work in progress.

__**Stephanie**__: working on character movements within the virtual environment. Significant progress made here! We can now make a character capable of crouching; however, a little more work needs to be done specifying game physics (e.g., object and character physical boundaries).

__**Priya & Dale**__: working on boundaries and game physics. Still a work in progress.

\\

==== WHAT'S HAPPENING NEXT: ====

We came to the conclusion that it's time to start merging our work into one working draft. Anthony believes the solution is to use Git. Git will allow us to combine our Blender files, with the goal of creating a version of the Blender game that we can use for the actual study.

=== GOAL(S): ===

Finish coding before the SONA pool closes.

Figure out remaining game physics and looping issues.

NOTE FROM __**JAMES**__:

I definitely think the remaining issues can be figured out before the SONA pool closes. I anticipate having figured out all of the remaining kinks in the looping/respawning over spring break. I'll also work on the game physics.

Special thanks to __**Stephanie**__ for the demo today (everything looks really great and has a polished feel). Good job so far, everyone!

\\

__**Agilay**__:

Found http://sourceforge.net/projects/minimousemacro/
This is a mouse macro recorder.
Determined not to be ideal -- ruled out.

__**James**__

The seemingly random links I posted below reflect my attempt to make zoom/scaled distances on a screen congruent with actually moving up to an object in real life. This approach is a hybrid of the two different methods I was working on and is the most promising; the others would work on their own, but they have pieces that can be integrated to do what we want. I can explain more in a lab meeting or in person.

http://www.scantips.com/lights/subjectdistance.html

http://www.elysiun.com/forum/showthread.php?291445-What-every-one-should-know-about-camera-data-for-the-tracker

http://snapsnapsnap.photos/a-beginners-guide-for-manual-controls-in-iphone-photography-focus/

http://blender.stackexchange.com/questions/16131/lens-specs-of-smartphones-for-camera-tracking-in-blender
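The zoom/distance congruence the links above are about can be illustrated with the pinhole-camera model (a simplification that ignores perspective changes for deep scenes; function names here are hypothetical): image size scales as focal length over subject distance, so multiplying the focal length by a zoom factor z produces the same image size as moving the viewer to distance d/z.

```python
# Pinhole model: image size s = f * S / d. Multiplying f by z (zooming)
# therefore matches the image size obtained by moving to distance d / z.

def image_size(real_size, focal_length, distance):
    """Image-plane size of an object under the pinhole-camera model."""
    return focal_length * real_size / distance

def equivalent_distance(initial_distance, zoom_factor):
    """Viewing distance that matches zooming in by zoom_factor."""
    if zoom_factor <= 0:
        raise ValueError("zoom factor must be positive")
    return initial_distance / zoom_factor

# Example: a 2x zoom while standing at 4 m matches the view from 2 m.
```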

There are two or three approaches to tracking cursor behavior (including panning/zooming) using Blender.

== Way one: ==

  * Render video images in Blender with shading.
  * Use that shading to emulate 3D (panning/zooming).

== Way two: ==

  * Translate panning/zooming into 3D movement by having two tracking points, one for the cursor and one for a reference frame.
  * Problem: figure out how to deal with the movie image being cut off (this stops the tracking and mars the reference-frame tracking judgments made with Blender)... still figuring out the solution to this.
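The two-point idea in Way two can be sketched in plain Python (a minimal illustration, not the lab's actual tracking code; the point format and function name are assumptions): take the reference point's displacement between frames as pan, and the change in cursor-to-reference distance as zoom.

```python
from math import hypot

def pan_zoom(cursor_prev, ref_prev, cursor_next, ref_next):
    """Estimate pan and zoom between two frames from two tracked points.

    Pan is the displacement of the reference point; zoom is the ratio of
    cursor-to-reference distances. Points are (x, y) tuples.
    """
    pan = (ref_next[0] - ref_prev[0], ref_next[1] - ref_prev[1])
    d_prev = hypot(cursor_prev[0] - ref_prev[0], cursor_prev[1] - ref_prev[1])
    d_next = hypot(cursor_next[0] - ref_next[0], cursor_next[1] - ref_next[1])
    zoom = d_next / d_prev if d_prev else 1.0  # guard zero distance
    return pan, zoom
```

If either point is cut off at the image edge (the problem noted above), there is no valid pair for that frame, so the estimate has to be skipped or interpolated.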

=== To Do List: ===

  * Get everyone up to speed on Blender tracking.
  * Figure out if a 3D shading render is a viable option for cursor tracking (panning/zooming too).
  * Be able to obtain coordinates from a Python text file.

\\

====== Notes from 2016.10.8 ======

===== Remaining Tasks: =====

  * Clean up the aesthetic of the VE.
  * Make the Blender file an executable module.
  * Run a pilot subject!

===== Exporting tracking data =====

__**Stephanie**__ found this page with instructions on how to extract raw object tracking data on 11/16/15:

http://blender.stackexchange.com/questions/23218/getting-raw-data-from-object-track

There is also an option in Blender under File -> Export -> COLLADA (.dae) that will create a COLLADA file. Wikipedia describes this format as a 3D interchange file format and says Python has a module called pycollada that will read this data. The link above suggests you can use Python to export this data to .csv if needed.
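A minimal sketch of the CSV step, assuming the tracking data has already been pulled into Python as per-track lists of (frame, x, y) tuples (the data shape and function name are illustrative, not Blender's API):

```python
import csv
import io

def tracks_to_csv(tracks):
    """Write {track_name: [(frame, x, y), ...]} out as CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["track", "frame", "x", "y"])  # header row
    for name, markers in sorted(tracks.items()):
        for frame, x, y in markers:
            writer.writerow([name, frame, x, y])
    return buf.getvalue()

demo = {"cursor": [(1, 0.5, 0.5), (2, 0.52, 0.49)]}
print(tracks_to_csv(demo))
```

The same function can write straight to an open file handle instead of a ''StringIO'' buffer if the data should land on disk.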

\\

===== 02/15/2017 Lab meeting: Re-introduction to study and Task List =====

  - Aesthetics: presence and characteristics of floor, ceiling, walls, other computers, etc.
  - Make the Blender file an executable module.
  - Put words on the screen (one image with words and one image with just boxes).
  - Generate and export the data file.
  - Make poison-boxes for all the words.
  - Blender tracking (Stephanie and James posted links about how to do this above).
  - Write the IRB.
  - Assign experimenter roles to everyone.

====== TODO list ======
  
  
~~TODOLIST showdate:yes ns:research:internal_small_version~~
  
research/internal_small_version.1478285008.txt.gz · Last modified: 2019/05/22 16:08 (external edit)