Notes for design of the small screen version of the "Gigapixel memory" study
2016-02-10 Blender files directory in VNLab folder
James had made a folder …/VNLab/Blender Tracking/
to hold useful Blender files. Today Anthony changed the folder's name to blender
and moved it to a new location:
…VNLab\studies\LLID\code\blender
End of Spring 2015 semester notes
[taken by Daijreous Poole (?)]
Brainstorming
- remind Dr. Cate that we are trying to do the same thing and see what's going on, NOT to do something new.
- OBS (Open Broadcaster Software) can track the mouse for the small-screen study (free software)
- check whether the touchscreen function works
- use a Wii controller to zoom, pan, etc.
- keep time per trial similar in both studies (as well as breaks)
- Dr. Cate writes tasks, and we can sign up for them on the wiki.
2015.9.28 lab meeting
AC's notes before the meeting
- Specify display and PC to use
- Need to install Ubuntu operating system on PC
- Then set up system for panning/zooming and recording mouse/display state
- Automate reading of words this time?
- Participants need to stand (but probably not move around)
Notes AC took during the meeting:
[Headings represent AC's attempt to organize comments into themes.]
Comparison with large display version
- Stephanie: What's the focus: equating the Gigapixel display context, or difference between user interfaces?
- James: loss of proprioceptive cues for item location unavoidable
- Draw bezels in picture.
- Do study in dark room.
- Pan out farther than the whole image, to include view of feet of the GP monitor?
- AC: give up on drawing cues from outside of frame of image.
UI (User Interface)
- How to measure errors? Need a mouse click to end the trial.
- Scrolling: easy to lose track. Way to keep track?
- Thumbnail map in corner of display?
- Other method besides thumbnail map of image? Some cues to indicate position that are like image of legs for down, etc.
- Controls: game-style controller? A controller with two controls, one for moving, one for cursor location.
[lab] Specify first set of tasks with target dates.
2015.10.12 lab meeting notes
[by AC]
Software for tracking interaction with display
Candidate software websites
Use MATLAB figure interface functions
Links to resources
Use Linux X-Window functions
Links to resources
http://www.x.org/archive/X11R7.6/doc/man/man1/xev.1.xhtml
https://unix.stackexchange.com/questions/129159/record-every-keystroke-and-store-in-a-file
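If we go the X-Window route, xev (linked above) just prints events as text, so we would need to parse that text into usable coordinates. A minimal, hypothetical sketch of such a parser, assuming xev's usual event-block layout (the exact output format should be checked against the xev man page):

```python
import re

# xev prints blocks like:
#   MotionNotify event, serial 37, synthetic NO, window 0x1600001,
#       root 0x1a1, subw 0x0, time 20889243, (65,81), root:(1179,612),
#       state 0x0, is_hint 0, same_screen YES
# The first "(x,y)" pair before "root:" is the pointer position in
# window coordinates. (Format assumed from typical xev output.)
EVENT_RE = re.compile(r"^(\w+) event")
COORD_RE = re.compile(r"\((-?\d+),(-?\d+)\), root:")

def parse_xev(text):
    """Return a list of (event_name, x, y) tuples from raw xev output."""
    events = []
    current = None
    for line in text.splitlines():
        m = EVENT_RE.match(line)
        if m:
            current = m.group(1)
            continue
        c = COORD_RE.search(line)
        if c and current:
            events.append((current, int(c.group(1)), int(c.group(2))))
            current = None
    return events
```

Piping `xev | python parse_script.py` (or reading a saved log, as in the second link) would then give a timestamp-free stream of pointer positions to record per trial.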
2015.10.24 Notes
Notes on software
SR (2015.10.24): Downloaded and tried MouseTracker (http://www.mousetracker.org/)
- Very simple display, maybe too simplistic for our needs.
- Not sure how pan/zoom could be applied through native GUI.
- Program can interface with MATLAB, but it still requires an unusual .csv spreadsheet setup that would take some effort to figure out (it is probably straightforward in practice, but it is foreign enough that I predict a steep learning curve).
- Because of this, I don't even know where to start with a demo. First impressions make me not want to continue.
2015.10.12 lab meeting notes
Pseudo-code for a possible pan/zoom interface
[AC's notes ahead of meeting:]
Define keyboard functions:
    up/down/L/R arrows control panning
    mouse wheel controls zooming
    mouse x/y controls cursor location [should be default]
Begin trial loop
    Display entire image (maximally zoomed out).
    Overlay thumbnail of entire image in small pane.
    Move cursor to center of display.
    Begin display update loop (infinite)
        Check keyboard state
            Change both images as needed
        Check mouse state
            Change cursor location as needed
        if maximally zoomed in
            Allow mouse click
            Check mouse button state
            if mouse button down
                record x/y location
                continue to next trial
            end
        end
        save all keyboard, mouse, button states
    end display update loop
end trial loop
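As a sanity check on the control flow above, here is a minimal, non-graphical sketch of one trial as a pure event loop. All names (State, run_trial, MAX_ZOOM, the event tuples) are hypothetical; a real version would sit on top of MATLAB figure callbacks or a game loop.

```python
from dataclasses import dataclass

MAX_ZOOM = 4  # assumed number of wheel steps to "maximally zoomed in"

@dataclass
class State:
    pan_x: int = 0
    pan_y: int = 0
    zoom: int = 0            # 0 = fully zoomed out
    cursor: tuple = (0, 0)

def run_trial(events):
    """Consume (kind, payload) events; return (recorded click or None, log).

    The log of every intermediate state mirrors the
    'save all keyboard, mouse, button states' step in the pseudo-code.
    """
    state, log = State(), []
    for kind, payload in events:
        if kind == "key":                      # arrow keys pan
            dx, dy = {"L": (-1, 0), "R": (1, 0),
                      "up": (0, 1), "down": (0, -1)}[payload]
            state.pan_x += dx
            state.pan_y += dy
        elif kind == "wheel":                  # scroll wheel zooms
            state.zoom = max(0, min(MAX_ZOOM, state.zoom + payload))
        elif kind == "move":                   # mouse moves cursor
            state.cursor = payload
        elif kind == "click" and state.zoom == MAX_ZOOM:
            # clicks are only allowed at maximum zoom, per the pseudo-code
            log.append((kind, state.cursor))
            return state.cursor, log           # 'continue to next trial'
        log.append((kind, (state.pan_x, state.pan_y, state.zoom)))
    return None, log
```

Note the one design decision the pseudo-code forces: a click ends the trial only when maximally zoomed in; at any other zoom level the click is ignored but still logged.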
Notes about matlab tools for interactive UI
By AC:
Overview panel for zoomed-in images
The base distribution of MATLAB includes functions for automatically generating a second panel showing an overview, with a rectangle marking the visible part of the main image:
- imscrollpanel.m
- imoverview.m
- imoverviewpanel.m
- pan.m
http://www.mathworks.com/help/images/creating-the-modular-tools.html#f17-59762
2015.11.9
Matlab efforts
AC made a new directory on uniqua: /home/anthony/Documents/MATLAB/LLID/lsnd/
“lsnd” stands for “Learning in Small Non-interactive Displays,” which is meant to mean the opposite of the abbreviation of the overall study (LLID = Learning in Large Interactive Displays).
Made 3 matlab scripts by copying code from websites:
my_large_image_display.m
From http://www.mathworks.com/help/images/creating-the-modular-tools.html#f17-59762
This is a function that sets up an image overview pane.
scroll_display_TEST.m
From: http://www.mathworks.com/help/images/creating-the-modular-tools.html#f17-59762
This is a script that calls the above function and displays an image.
scroll_wheel.m
From: http://www.mathworks.com/help/matlab/ref/figure-properties.html#WindowScrollWheelFcn
This is a function/script that demonstrates how to assign a function to the scroll wheel. The function is a pretty useless one for our purposes: it resets the scale of the x-axis.
OLD TODO: Integrate scroll_wheel.m function into scroll_display_TEST.m script, and change scroll wheel function to do zooming in and out.
Other links that might help for that:
http://www.mathworks.com/help/images/creating-the-modular-tools.html#f17-59762
http://www.mathworks.com/matlabcentral/fileexchange/28998-navigate-m/content/navigate.m
http://www.mathworks.com/help/images/ref/imscrollpanel.html
http://www.mathworks.com/matlabcentral/fileexchange/25666-mouse-friendly-figure
http://www.mathworks.com/help/matlab/ref/zoom.html
http://www.mathworks.com/matlabcentral/answers/8790-reading-arrow-key-input
http://www.mathworks.com/matlabcentral/fileexchange/3090-zoom-keys
http://www.mathworks.com/matlabcentral/fileexchange/22334-keyboardnavigate
http://undocumentedmatlab.com/blog/uicontrol-callbacks
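The OLD TODO above (wiring the scroll wheel to zooming) mostly boils down to rescaling the visible axis limits about the cursor position. A sketch of just that arithmetic, in Python for testability (the real version would live in a MATLAB WindowScrollWheelFcn callback; the function name is hypothetical):

```python
def zoom_limits(lo, hi, center, factor):
    """Rescale the interval [lo, hi] about `center` by `factor`.

    factor < 1 zooms in, factor > 1 zooms out. Applying this to both
    the x-limits and y-limits zooms the view while keeping the point
    under the cursor fixed on screen.
    """
    return (center - (center - lo) * factor,
            center + (hi - center) * factor)
```

In the MATLAB callback, `factor` would come from the scroll event's VerticalScrollCount (e.g. 0.9 per click up, 1/0.9 per click down), and `center` from the cursor's CurrentPoint in axis coordinates.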
Notes from 2016-02-29
Notes from Fall 2015:
The goal for this semester was to figure out how to track cursor movement on the small screen. To do this, we have been learning to use software called Blender; in parallel, Anthony has been trying to do the same in MATLAB. By the end of the semester, we hope to decide which software to use for the final study.
Exact notes on Blender follow:
Blender Tasks:
- Record video of participant.
- Find tracking points to track zooming: zooming can be tracked and quantified based on the screen recording inside of Blender.
- To do: find out how to get coordinates from a ten-second recording session
- Use script to get tracking coordinates (uploaded on the lab Google Drive)
James & Agilay: working on tracking participant responses and programming loops. Still a work in progress.
Stephanie: working on character movements within the virtual environment. Significant progress made here! We can now make a character capable of crouching; however, a little more work needs to be done specifying game physics (e.g., object and character physical boundaries).
Priya & Dale: working on boundaries and game physics. Still a work in progress.
WHAT'S HAPPENING NEXT:
We came to the conclusion that it's time to start merging our work into one working draft. Anthony believes the solution is to use Git, which will let us combine our Blender files, with the goal of creating a version of the Blender game that we can use for the actual study.
GOAL(S):
Finish coding before the SONA pool closes.
Figure out remaining game physics and looping issues.
NOTE FROM JAMES:
I definitely think the remaining issues can be figured out before the SONA pool closes. I anticipate having figured out all of the remaining kinks in the looping/respawning over spring break. I'll also work on the game physics.
Special thanks to Stephanie for the demo today (everything looks really great and has a polished feel). Good job so far everyone!
Agilay:
Found http://sourceforge.net/projects/minimousemacro/, a mouse macro recorder. Determined not to be ideal; ruled out.
James
The seemingly random links I posted below reflect my attempt to make zoomed/scaled distances on a screen congruent with actually moving up to an object in real life. This approach is a hybrid of the two methods I was working on and is the most promising; the others might work individually, but each has pieces that can be successfully integrated to do what we want. I can explain more in a lab meeting or in person.
http://www.scantips.com/lights/subjectdistance.html
http://snapsnapsnap.photos/a-beginners-guide-for-manual-controls-in-iphone-photography-focus/
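The idea behind the links above can be made concrete with the standard geometry of apparent size: an object of physical size s at distance d subtends a visual angle of 2·atan(s/2d), so the zoom factor that mimics walking closer is (to a good approximation) the ratio of distances. A sketch of that arithmetic, with hypothetical function names:

```python
import math

def visual_angle(size, distance):
    """Visual angle (radians) subtended by an object of `size` at `distance`,
    both in the same units: theta = 2 * atan(size / (2 * distance))."""
    return 2.0 * math.atan(size / (2.0 * distance))

def zoom_for_distance(d_near, d_far):
    """Zoom factor that makes a view from d_far match standing at d_near.

    Computed as the ratio of on-screen (tangent) sizes; for small angles
    this reduces to d_far / d_near, i.e. halving the simulated distance
    corresponds to a 2x zoom.
    """
    return (math.tan(visual_angle(1.0, d_near) / 2.0) /
            math.tan(visual_angle(1.0, d_far) / 2.0))
```

Under this scheme, each zoom step in the interface could be mapped to a fixed ratio of simulated viewing distances, which is what would make on-screen zooming congruent with physically approaching an item on the Gigapixel wall.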
There are two or three approaches to tracking cursor behavior (including panning/zooming) using Blender.
Way one:
- render video images in blender with shading
- use that shading to emulate 3d (panning/zooming)
Way two:
- translate panning/zooming as 3d movement by having two tracking points, one for the cursor and one for a reference frame.
- Problem: figure out how to deal with the movie image being cut off (this stops the tracking and mars the reference-frame tracking judgments made in Blender; still figuring out a solution to this).
To Do List:
- get everyone up to speed on Blender tracking.
- figure out if the 3D shading render is a viable option for cursor tracking (and for panning/zooming)
- be able to obtain coordinates from the Python text file.
Notes from 2016.10.8
Remaining Tasks:
- Clean-up aesthetic of VE
- Make Blender file an executable module.
- Run a pilot subject!
Exporting tracking data
Stephanie found this page with instructions on how to extract raw object tracking data on 11/16/15:
http://blender.stackexchange.com/questions/23218/getting-raw-data-from-object-track
There is also an option in Blender under File → Export → COLLADA (.dae) that will create a COLLADA file. Wikipedia describes this as a 3D interchange file format and notes that Python has a module called pycollada that can read it. The link above suggests you can use Python to export this data to .csv if needed.
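Per the linked Stack Exchange answer, the raw marker data should be reachable from Blender's Python console (roughly via the movie clip's tracking tracks, each marker carrying a frame number and normalized coordinates; verify the exact attribute names against your Blender version's API docs). The CSV-writing half can be sketched and tested outside Blender:

```python
import csv
import io

def markers_to_csv(markers):
    """Write (track_name, frame, x, y) marker rows to CSV text.

    `markers` is an iterable of (name, frame, x, y) tuples. Inside
    Blender it would be built with something like
        [(t.name, m.frame, m.co[0], m.co[1])
         for t in clip.tracking.tracks for m in t.markers]
    (attribute names assumed from the linked answer, not verified here).
    """
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["track", "frame", "x", "y"])
    for name, frame, x, y in markers:
        w.writerow([name, frame, x, y])
    return buf.getvalue()
```

Running this in Blender's scripting tab and writing the returned string to a file would give us one .csv per recording session, which covers the "obtain coordinates from the Python text file" item on the to-do list above.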
02/15/2017 Lab meeting: Re-introduction to study and Task List
1. Aesthetics: Presence and characteristics of floor, ceiling, walls, other computers, etc.
2. Make Blender file an executable module.
3. Putting words on the screen (one image with words and one image with just boxes)
4. Generating and exporting the data file
5. Making poison-boxes for all the words
6. Blender tracking (Stephanie and James posted links about how to do this above)
7. Writing IRB
8. Assigning experimenter roles to everyone
TODO list
NOTE: the lists below are automatically generated based on searching the rest of the page above. Add new “to do” items in the main body of the page (above here), not here.
Items due within 7 days
Items due later
All items