research:internal_small_version
====== Notes for design of the small screen version of the "
(To add notes: log in, click on the pencil-looking icon to the right to edit, and then just start typing ...)
===== 2016-02-10 Blender files directory in VNLab folder =====
''

\\

====== End of Spring 2015 semester notes ======
[taken by __**Daijreous Poole**__ (?)]
===== Brainstorming =====

\\
====== 2015.9.28 lab meeting ======
===== AC's notes before the meeting =====
* Specify display and PC to use

==== Comparison with large display version ====
* Stephanie: What's the focus: equating the Gigapixel display context, or difference between user interfaces?
* James: loss of proprioceptive cues for item location unavoidable
* Draw bezels in picture.

* Pan out farther than the whole image, to include view of feet of the GP monitor?
* AC: give up on drawing cues from outside of frame of image.
==== UI (User Interface) ====

----
<todo @lab #
====== 2015.10.12 lab meeting notes ======
[by AC]
===== Software for tracking interaction with display =====

http://
====== 2015.10.24 Notes ======

===== Notes on software =====

__**SR**__ (2015.10.24):
* Very simple display, maybe too simplistic for our needs.
* Not sure how pan/zoom could be applied through native GUI.
===== Pseudo-code for a possible pan/zoom interface =====
[AC's notes ahead of meeting:]
<
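The pseudo-code itself is cut off in this copy. As a sketch of what such an interface has to compute, here is one way to express pan and zoom as a viewport transform; all function and variable names are hypothetical, not from the original notes:

```python
# Hypothetical sketch of pan/zoom viewport math (not the original pseudo-code).
# The viewport maps image coordinates to screen coordinates:
#   screen = (image - center) * scale + screen_center

def zoom_at(center, scale, cursor_img, factor):
    """Zoom by `factor`, keeping the image point under the cursor fixed.

    center     -- (x, y) image point currently at the middle of the screen
    scale      -- current zoom scale (screen pixels per image pixel)
    cursor_img -- (x, y) image point under the mouse cursor
    factor     -- e.g. 2.0 to zoom in, 0.5 to zoom out
    """
    new_scale = scale * factor
    # Keep cursor_img at the same screen position: pull the view center
    # toward the cursor as we zoom in.
    cx = cursor_img[0] + (center[0] - cursor_img[0]) / factor
    cy = cursor_img[1] + (center[1] - cursor_img[1]) / factor
    return (cx, cy), new_scale

def pan(center, scale, drag_dx, drag_dy):
    """Pan by a mouse drag of (drag_dx, drag_dy) screen pixels."""
    return (center[0] - drag_dx / scale, center[1] - drag_dy / scale), scale
```

Dividing the drag by `scale` keeps panning speed constant in screen pixels regardless of zoom level, which is the behavior most map-style interfaces use.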
====== Notes about matlab tools for interactive UI ======
By AC:
===== Overview panel for zoomed-in images =====

===== Matlab efforts =====
AC made a new directory on uniqua: ''/
"
\\
----

====== Notes from 2016-02-29 ======

\\

===== Notes from Fall 2015: =====

The goal for this semester was to figure out how to track cursor movement on the small screen. To do this we have been learning how to use software called Blender. In parallel, Anthony has been trying to do the same using MATLAB. By the end of the semester, we hope to decide which software to use for the final study.

Exact notes on Blender are as follows:

==== Blender Tasks: ====

* Record video of the participant.
* Find tracking points to track zooming: zooming can be tracked and quantified based on the screen recording inside of Blender.
* To do: find out how to get coordinates from a ten-second recording session.
* Use script to get tracking coordinates (uploaded on the lab Google Drive).

\\
----

__**James & Agilay**__: working on tracking participant responses and programming loops. Still a work in progress.

__**Stephanie**__: working on character movements within the virtual environment. Significant progress made here! We can now make a character capable of crouching; however, a little more work needs to be done specifying game physics (e.g. object and character physical boundaries).

__**Priya & Dale**__: working on boundaries and game physics. Still a work in progress.

\\

==== WHAT'S HAPPENING NEXT: ====

We came to the conclusion that it's time to start merging our work into one working draft. Anthony believes the solution is to use Git. Git will allow us to combine our Blender files, with the goal of creating a version of the Blender game that we can use for the actual study.

=== GOAL(S): ===

Finish coding before the SONA pool closes.

Figure out remaining game physics and looping issues.

NOTE FROM __**JAMES**__:

I definitely think the remaining issues can be figured out before the SONA pool closes. I anticipate having figured out all of the remaining kinks in the looping/

Special thanks to __**Stephanie**__ for the demo today (everything looks really great and has a polished feel). Good job so far everyone!

\\

__**Agilay**__:

Found http://
This is a mouse macro recorder.
Determined not to be ideal -- ruled out.

__**James**__

The seemingly random links I posted below reflect my attempt to make zoom/scaled distances on a screen congruent with actually moving up to an object in real life. This approach is a hybrid of the two different methods I was working on and is the most promising; each would work on its own, but they have pieces that can be successfully integrated to do what we want. I can explain more in a lab meeting or in person.

http://

http://

http://

http://
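One way to make on-screen zoom congruent with physically approaching an object, in the spirit of the note above, is to match visual angle: pick the zoom factor at which the object on screen subtends the same angle at the viewer's eye as the real object would at the simulated distance. This is only an illustrative sketch of that idea; all names are hypothetical and not from the original notes:

```python
import math

# Hypothetical sketch: choose a zoom factor so that an on-screen object
# subtends the same visual angle as a real object viewed from a given
# (simulated) distance.

def visual_angle(size, distance):
    """Visual angle (radians) subtended by an object of `size` at `distance`."""
    return 2.0 * math.atan(size / (2.0 * distance))

def zoom_for_distance(obj_size, sim_distance, screen_distance, base_screen_size):
    """Zoom factor making an object drawn at `base_screen_size` (same units as
    obj_size), on a screen viewed from `screen_distance`, subtend the same
    visual angle as the real object at `sim_distance`."""
    target = visual_angle(obj_size, sim_distance)
    # On-screen size needed to subtend `target` at the viewer's eye:
    needed = 2.0 * screen_distance * math.tan(target / 2.0)
    return needed / base_screen_size
```

For example, with the viewer at the same distance from the screen as the simulated object, halving the simulated distance requires exactly doubling the on-screen size, which matches the intuition that walking halfway to an object makes it look twice as big (for small angles).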

There are two or three approaches to tracking cursor behavior (including panning/

== Way one: ==

* render video images in Blender with shading
* use that shading to emulate 3d (panning/

== Way two: ==

* translate panning/
* problem: figure out how to deal with the movie image being cut off (this stops the tracking and mars reference-frame tracking judgments made with Blender ... still figuring out the solution to this).
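"Way two" above mentions a second tracker for a reference frame. The core of that idea could reduce to subtracting the reference tracker's motion from the object tracker's, which cancels out global panning of the recorded image. A sketch under that assumption, with all names hypothetical:

```python
# Hypothetical sketch: recover cursor/object motion relative to a reference
# frame from two tracked points (an object tracker and a reference tracker).

def relative_track(object_track, reference_track):
    """Subtract the reference tracker's motion from the object tracker's.

    Both arguments are lists of (frame, x, y) tuples sampled on the same
    frames. Returns (frame, dx, dy): the object's position relative to the
    reference point, independent of any panning of the recorded image.
    """
    rel = []
    for (f1, ox, oy), (f2, rx, ry) in zip(object_track, reference_track):
        assert f1 == f2, "tracks must be sampled on the same frames"
        rel.append((f1, ox - rx, oy - ry))
    return rel
```

This also shows where the cut-off problem bites: once either marker leaves the frame, its track stops and the subtraction has no reference to work against.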

=== To Do List: ===

* get everyone up to speed on Blender tracking.
* figure out if a 3d shading render is a viable option for cursor tracking (panning/
* be able to obtain coordinates from the Python text file.

\\

====== Notes from 2016.10.8 ======

===== Remaining Tasks: =====

* Clean up the aesthetics of the VE.
* Make the Blender file an executable module.
* Run a pilot subject!

===== Exporting tracking data =====

__**Stephanie**__ found this page with instructions on how to extract raw object tracking data on 11/16/15:

http://

There is also an option in Blender under File -> Export -> COLLADA (.dae) that will create a COLLADA file. Wikipedia describes this format as a 3D interchange file format, and says Python has a module called pycollada that will read this data. The link above suggests you can use Python to export this data into .csv if needed.
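The .csv step the link suggests can be done with Python's standard ''csv'' module. A sketch, assuming the tracking data has already been extracted (e.g. via pycollada or a Blender script) into (frame, x, y) rows; the function name is hypothetical:

```python
import csv

# Hypothetical sketch: write already-extracted tracking coordinates to .csv.
# Assumes `rows` is an iterable of (frame, x, y) tuples.

def write_tracking_csv(rows, path):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame", "x", "y"])  # header row
        writer.writerows(rows)
```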

\\

===== 02/15/2017 Lab meeting: Re-introduction to study and Task List =====

1. Aesthetics: presence and characteristics of floor, ceiling, walls, other computers, etc.

2. Make the Blender file an executable module.

3. Putting words on the screen (one image with words and one image with just boxes).

4. Generating and exporting the data file.

5. Making poison-boxes for all the words.

6. Blender tracking (Stephanie and James posted links about how to do this above).

7. Writing the IRB.

8. Assigning experimenter roles to everyone.
====== TODO list ======

~~TODOLIST showdate:
research/internal_small_version.1463677424.txt.gz · Last modified: 2019/05/22 16:08 (external edit)