From 56cd921d459c44c555adc9ace687d916a4f2adba Mon Sep 17 00:00:00 2001 From: RedHawk989 Date: Mon, 12 May 2025 23:53:29 +0000 Subject: [PATCH] Update README.md --- 404.html | 2 +- about.html | 2 +- archive/fox_ir_v2_build_instructions.html | 2 +- assets/misc_faq.md.8d641c6c.js | 1 - assets/misc_faq.md.8d641c6c.lean.js | 1 - assets/misc_faq.md.a2c23752.js | 1 + assets/misc_faq.md.a2c23752.lean.js | 1 + contact.html | 2 +- dev_roadmap.html | 2 +- development/docs/dev_docs.html | 2 +- development/docs/pages.html | 2 +- development/docs/standards.html | 2 +- firmware_guide/configure_firmware.html | 2 +- firmware_guide/environments.html | 2 +- firmware_guide/firmware.html | 2 +- firmware_guide/flashing_tool.html | 2 +- firmware_guide/mdns.html | 2 +- firmware_guide/rest_api.html | 2 +- firmware_guide/setup_vscode.html | 2 +- firmware_guide/update_platformio.html | 2 +- firmware_guide/upload_and_update_firmware.html | 2 +- getting_started/intro.html | 2 +- getting_started/led_safety.html | 2 +- getting_started/things_to_know.html | 2 +- hashmap.json | 2 +- headset_guides/valve_index.html | 2 +- headset_guides/what_is_this.html | 2 +- how_to_build/3d_printed_mounts.html | 2 +- how_to_build/creating_your_own_mount.html | 2 +- how_to_build/full_build.html | 2 +- how_to_build/led_setup.html | 2 +- how_to_build/part_list.html | 2 +- how_to_build/parts_list.html | 2 +- how_to_build/preparing_cameras.html | 2 +- how_to_build/preparing_xiao.html | 2 +- how_to_build/v4_full_build.html | 2 +- index.html | 2 +- misc/faq.html | 6 +++--- misc/jlc3dp.html | 2 +- misc/vrc_avatar_setup.html | 2 +- software_guide/VRCFT_tracking_module.html | 2 +- software_guide/build_software.html | 2 +- software_guide/eyetrackvr_app_guide.html | 2 +- software_guide/osc_setup.html | 2 +- 44 files changed, 44 insertions(+), 44 deletions(-) delete mode 100644 assets/misc_faq.md.8d641c6c.js delete mode 100644 assets/misc_faq.md.8d641c6c.lean.js create mode 100644 assets/misc_faq.md.a2c23752.js create mode 100644 assets/misc_faq.md.a2c23752.lean.js diff --git a/404.html b/404.html index 1b4adc5..00cf761 100644 --- a/404.html +++ b/404.html @@ -15,7 +15,7 @@
Skip to content

404

PAGE NOT FOUND

But if you don't change your direction, and if you keep looking, you may end up where you are heading.

Released under the MIT License.

- + \ No newline at end of file diff --git a/about.html b/about.html index f0e28ca..6676a0c 100644 --- a/about.html +++ b/about.html @@ -18,7 +18,7 @@
Skip to content

Our Team

The development of EyeTrackVR is guided by an international team, some of whom have chosen to be featured below.

EyeTrackVR developers are a group of people who are passionate about the field of augmented and virtual reality.

Prohurtz

Prohurtz

Creator / Lead Software Developer / Documentor / Hardware Developer

Summer

Summer

Machine Learning Engineer / Data Scientist / App Developer

lorow

lorow

Lead Firmware Developer / App Developer

DaOfficialWizard

DaOfficialWizard

Firmware Developer / Documentation Manager / App Developer

Contributors

Those who have actively contributed to development.
Community Support

Philosophy

The guides on this website include some of our team's own notes (not all of them are polished) that we disclose for other people to use.

Here, we hope you may find something useful to you.

We advocate the Open Source model.

This is why we strive to make our work open to other people for consultation, replication and reuse.

Released under the MIT License.

- + \ No newline at end of file diff --git a/archive/fox_ir_v2_build_instructions.html b/archive/fox_ir_v2_build_instructions.html index 251b493..8315ea4 100644 --- a/archive/fox_ir_v2_build_instructions.html +++ b/archive/fox_ir_v2_build_instructions.html @@ -19,7 +19,7 @@
Skip to content

Fox IR V2 Build Instructions

Step 15: Prepare to solder IR LED PCB V2s

Get your magnifying glass out, it's time to solder very smol things.

Gather 4 PCBs, 4 IR LEDs, and 2 ~700ohm resistors.

'img'

698ohm resistors and V2 PCBs

Here are the PCB pin-out labels:

'img'

V2

LED labels:

'img'

The green markings and notched corners mark the positive sides of the LEDs pictured above.

If you have different LEDs, please consult their datasheet.

Some terminology related to them:

5V: 5-volt power in.

GND: Ground or power out.

AR: After-Resistor. Use this as the power in on the 2nd PCB in series; resistors are not needed on the 2nd PCB since they are already on the 1st one.

SNG: Single resistor, use this as 5V in if you are using only 1 ~700ohm resistor on V3 boards (not recommended).

Negative: This marks the negative side of the LED.

Positive: This marks the positive side of the LED.
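As a rough sanity check on the ~700ohm value (a sketch only, assuming one ~700ohm resistor in series with the two LEDs of one eye, as in the wiring diagram further down, and a typical IR LED forward voltage of about 1.2-1.5V; always check your own LED's datasheet): the LED current works out to roughly (5V - 2 × 1.35V) / 698ohm ≈ 3.3mA per eye, which keeps the LEDs far below a typical 20mA rating.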

Step 17: Solder resistors on PCB V2

Only 1 PCB per eye needs to have a resistor.

'img'

Tin the resistor pads.

'img'

Hold the resistor in place.

'img'

Solder one end.

'img'

Flip to the other side of the resistor and solder it.

'img'

Solder LEDs on PCB V2

'img'

Tin the pads

'img'

Place the LED on the pads in the correct orientation.

'img'

Solder each side of the LED. Be careful not to solder at too high of a temperature; the recommended soldering temp is 230C, with a max of 245C.

'img'

The LED is flipped in this image; the green dot should face AWAY from the PCB.

Wire up the PCBs V2

Basic full wiring diagram of IR PCBs V2.

'drawing'

The PCB that receives the 5V power is the one with the resistor. The second PCB, which gets its power from the ground pin of the first, does not have a resistor on it, and its power input pin is the AR pin (After-Resistor). The 2nd PCB's ground pin goes to the ground of the system; in the diagram it is the ESP's ground pin.

Released under the MIT License.

- + \ No newline at end of file diff --git a/assets/misc_faq.md.8d641c6c.js b/assets/misc_faq.md.8d641c6c.js deleted file mode 100644 index a7697a3..0000000 --- a/assets/misc_faq.md.8d641c6c.js +++ /dev/null @@ -1 +0,0 @@ -import{A as d}from"./chunks/Accordion.aff4964d.js";import{o as a,c as n,F as u,B as h,b as c,w as r,k as e,t as i,e as p,l as f,a as m,E as y}from"./chunks/framework.47aa8d5b.js";const _={faq:[{question:"What is the goal of this project?",answer:"To provide an open source, affordable VR eye tracker for Social games like VRChat as well as provide an open eye tracking platform.",hyper_link:"",link_description:""},{question:"What headsets will be supported?",answer:"Any headset that the hardware can fit in with community mounts or a mount you design.",hyper_link:"",link_description:""},{question:"How will this work?",answer:"Currently, a camera is mounted inside the headset for each eye. The camera streams through wifi to a PC client which processes and sends eye tracking data to an OSC endpoint ex VRChat.",hyper_link:"",link_description:""},{question:"What features will be supported?",answer:"The goal is eye tracking with eye openness, and some form of pupil dilation. A far away aspiration of this project is some form of weak foveated rendering because it's cool and any small performance increase in VR is welcome.",hyper_link:"",link_description:""},{question:"When will this be completed?",answer:"When it's done 😉 I have a semi busy life so development may slow and speed up inconsistently.",hyper_link:"",link_description:""},{question:"Will IR damage my eyes?",answer:"This project has safety in mind. If you do all of the safety measures that we put into place and visually test the amount of IR light you will be fine. Please note we have not finished development of all safety stuff so be careful.",hyper_link:"https://dammedia.osram.info/media/bin/osram-dam-2496608/AN002_Details%20on%20photobiological%20safety%20of%20LED%20light%20sources.pdf",link_description:" here is a pdf with safety information"},{question:"How expensive will this be?",answer:"My goal is to keep it as cheap as possible with around $75 as the absolute max, with current setups being around $30-80. aaaa",hyper_link:"",link_description:""},{question:"How do I set up my avatar?",answer:"Check out the VR Chat face tracking wiki on our github. Keep in mind that we currently only support float parameters. 
",hyper_link:"",link_description:""},{question:"What hardware does this use / How do I build?",answer:"Checkout our build guides on this site for specific information, hardware may change and evolve over time.",hyper_link:"",link_description:""},{question:"Where are the docs?",answer:"All of our documentation is located on this website, all future documentation will be found here as well.",hyper_link:"",link_description:""}]},k={class:"font-semibold text-xl"},w={class:""},q={style:{"list-style":"none"}},g={key:0},b=["href"],v={__name:"FAQ",setup(l){return(o,t)=>(a(),n("div",null,[(a(!0),n(u,null,h(f(_).faq,s=>(a(),c(d,{class:"mb-4"},{title:r(()=>[e("span",k,i(s.question),1)]),content:r(()=>[e("div",null,[e("span",w,[e("blockquote",null,[e("ul",q,[e("li",null,i(s.answer),1),s.hyper_link!=o.NULL&&s.hyper_link!=""?(a(),n("p",g,[e("a",{href:s.hyper_link,target:"_blank"},i(s.link_description),9,b)])):p("",!0)])])])])]),_:2},1024))),256))]))}},V=JSON.parse('{"title":"Frequently Asked Questions","description":"","frontmatter":{},"headers":[],"relativePath":"misc/faq.md","filePath":"misc/faq.md","lastUpdated":1676223729000}'),x={name:"misc/faq.md"},W=Object.assign(x,{setup(l){return(o,t)=>(a(),n("div",null,[t[0]||(t[0]=e("h1",{class:"text-[var(--font-accent)]",id:"frequently-asked-questions",tabindex:"-1"},[m("Frequently Asked Questions "),e("a",{class:"header-anchor",href:"#frequently-asked-questions","aria-label":'Permalink to "Frequently Asked Questions {.text-[var(--font-accent)]}"'},"​")],-1)),t[1]||(t[1]=e("br",null,null,-1)),t[2]||(t[2]=e("hr",null,null,-1)),y(v)]))}});export{V as __pageData,W as default}; diff --git a/assets/misc_faq.md.8d641c6c.lean.js b/assets/misc_faq.md.8d641c6c.lean.js deleted file mode 100644 index a7697a3..0000000 --- a/assets/misc_faq.md.8d641c6c.lean.js +++ /dev/null @@ -1 +0,0 @@ -import{A as d}from"./chunks/Accordion.aff4964d.js";import{o as a,c as n,F as u,B as h,b as c,w as r,k as e,t as i,e as p,l as f,a as m,E as y}from"./chunks/framework.47aa8d5b.js";const _={faq:[{question:"What is the goal of this project?",answer:"To provide an open source, affordable VR eye tracker for Social games like VRChat as well as provide an open eye tracking platform.",hyper_link:"",link_description:""},{question:"What headsets will be supported?",answer:"Any headset that the hardware can fit in with community mounts or a mount you design.",hyper_link:"",link_description:""},{question:"How will this work?",answer:"Currently, a camera is mounted inside the headset for each eye. The camera streams through wifi to a PC client which processes and sends eye tracking data to an OSC endpoint ex VRChat.",hyper_link:"",link_description:""},{question:"What features will be supported?",answer:"The goal is eye tracking with eye openness, and some form of pupil dilation. A far away aspiration of this project is some form of weak foveated rendering because it's cool and any small performance increase in VR is welcome.",hyper_link:"",link_description:""},{question:"When will this be completed?",answer:"When it's done 😉 I have a semi busy life so development may slow and speed up inconsistently.",hyper_link:"",link_description:""},{question:"Will IR damage my eyes?",answer:"This project has safety in mind. If you do all of the safety measures that we put into place and visually test the amount of IR light you will be fine. 
Please note we have not finished development of all safety stuff so be careful.",hyper_link:"https://dammedia.osram.info/media/bin/osram-dam-2496608/AN002_Details%20on%20photobiological%20safety%20of%20LED%20light%20sources.pdf",link_description:" here is a pdf with safety information"},{question:"How expensive will this be?",answer:"My goal is to keep it as cheap as possible with around $75 as the absolute max, with current setups being around $30-80. aaaa",hyper_link:"",link_description:""},{question:"How do I set up my avatar?",answer:"Check out the VR Chat face tracking wiki on our github. Keep in mind that we currently only support float parameters. ",hyper_link:"",link_description:""},{question:"What hardware does this use / How do I build?",answer:"Checkout our build guides on this site for specific information, hardware may change and evolve over time.",hyper_link:"",link_description:""},{question:"Where are the docs?",answer:"All of our documentation is located on this website, all future documentation will be found here as well.",hyper_link:"",link_description:""}]},k={class:"font-semibold text-xl"},w={class:""},q={style:{"list-style":"none"}},g={key:0},b=["href"],v={__name:"FAQ",setup(l){return(o,t)=>(a(),n("div",null,[(a(!0),n(u,null,h(f(_).faq,s=>(a(),c(d,{class:"mb-4"},{title:r(()=>[e("span",k,i(s.question),1)]),content:r(()=>[e("div",null,[e("span",w,[e("blockquote",null,[e("ul",q,[e("li",null,i(s.answer),1),s.hyper_link!=o.NULL&&s.hyper_link!=""?(a(),n("p",g,[e("a",{href:s.hyper_link,target:"_blank"},i(s.link_description),9,b)])):p("",!0)])])])])]),_:2},1024))),256))]))}},V=JSON.parse('{"title":"Frequently Asked Questions","description":"","frontmatter":{},"headers":[],"relativePath":"misc/faq.md","filePath":"misc/faq.md","lastUpdated":1676223729000}'),x={name:"misc/faq.md"},W=Object.assign(x,{setup(l){return(o,t)=>(a(),n("div",null,[t[0]||(t[0]=e("h1",{class:"text-[var(--font-accent)]",id:"frequently-asked-questions",tabindex:"-1"},[m("Frequently Asked Questions "),e("a",{class:"header-anchor",href:"#frequently-asked-questions","aria-label":'Permalink to "Frequently Asked Questions {.text-[var(--font-accent)]}"'},"​")],-1)),t[1]||(t[1]=e("br",null,null,-1)),t[2]||(t[2]=e("hr",null,null,-1)),y(v)]))}});export{V as __pageData,W as default}; diff --git a/assets/misc_faq.md.a2c23752.js b/assets/misc_faq.md.a2c23752.js new file mode 100644 index 0000000..e92792d --- /dev/null +++ b/assets/misc_faq.md.a2c23752.js @@ -0,0 +1 @@ +import{A as d}from"./chunks/Accordion.aff4964d.js";import{o as a,c as n,F as u,B as c,b as h,w as r,k as e,t as i,e as p,l as f,a as m,E as y}from"./chunks/framework.47aa8d5b.js";const k={faq:[{question:"What is the goal of this project?",answer:"To provide an open source, affordable VR eye tracker for Social games like VRChat as well as provide an open eye tracking platform.",hyper_link:"",link_description:""},{question:"What headsets will be supported?",answer:"Any headset that the hardware can fit in with community mounts or a mount you design.",hyper_link:"",link_description:""},{question:"How will this work?",answer:"Currently, a camera is mounted inside the headset for each eye. The camera streams through wifi to a PC client which processes and sends eye tracking data to an OSC endpoint ex VRChat.",hyper_link:"",link_description:""},{question:"What features will be supported?",answer:"The goal is eye tracking with eye openness, and some form of pupil dilation. 
A far away aspiration of this project is some form of weak foveated rendering because it's cool and any small performance increase in VR is welcome.",hyper_link:"",link_description:""},{question:"When will this be completed?",answer:"When it's done 😉 I have a semi busy life so development may slow and speed up inconsistently.",hyper_link:"",link_description:""},{question:"Will IR damage my eyes?",answer:"This project has safety in mind. If you do all of the safety measures that we put into place and visually test the amount of IR light you will be fine. Please note we have not finished development of all safety stuff so be careful.",hyper_link:"https://look.ams-osram.com/m/1d720a7b18ab3fce/original/Details-on-photobiological-safety-of-LED-light-sources.pdf",link_description:" here is a pdf with safety information"},{question:"How expensive will this be?",answer:"My goal is to keep it as cheap as possible with around $75 as the absolute max, with current setups being around $30-80. aaaa",hyper_link:"",link_description:""},{question:"How do I set up my avatar?",answer:"Check out the VR Chat face tracking wiki on our github. Keep in mind that we currently only support float parameters. ",hyper_link:"",link_description:""},{question:"What hardware does this use / How do I build?",answer:"Checkout our build guides on this site for specific information, hardware may change and evolve over time.",hyper_link:"",link_description:""},{question:"Where are the docs?",answer:"All of our documentation is located on this website, all future documentation will be found here as well.",hyper_link:"",link_description:""}]},_={class:"font-semibold text-xl"},w={class:""},g={style:{"list-style":"none"}},q={key:0},b=["href"],v={__name:"FAQ",setup(l){return(o,t)=>(a(),n("div",null,[(a(!0),n(u,null,c(f(k).faq,s=>(a(),h(d,{class:"mb-4"},{title:r(()=>[e("span",_,i(s.question),1)]),content:r(()=>[e("div",null,[e("span",w,[e("blockquote",null,[e("ul",g,[e("li",null,i(s.answer),1),s.hyper_link!=o.NULL&&s.hyper_link!=""?(a(),n("p",q,[e("a",{href:s.hyper_link,target:"_blank"},i(s.link_description),9,b)])):p("",!0)])])])])]),_:2},1024))),256))]))}},V=JSON.parse('{"title":"Frequently Asked Questions","description":"","frontmatter":{},"headers":[],"relativePath":"misc/faq.md","filePath":"misc/faq.md","lastUpdated":1676223729000}'),x={name:"misc/faq.md"},W=Object.assign(x,{setup(l){return(o,t)=>(a(),n("div",null,[t[0]||(t[0]=e("h1",{class:"text-[var(--font-accent)]",id:"frequently-asked-questions",tabindex:"-1"},[m("Frequently Asked Questions "),e("a",{class:"header-anchor",href:"#frequently-asked-questions","aria-label":'Permalink to "Frequently Asked Questions {.text-[var(--font-accent)]}"'},"​")],-1)),t[1]||(t[1]=e("br",null,null,-1)),t[2]||(t[2]=e("hr",null,null,-1)),y(v)]))}});export{V as __pageData,W as default}; diff --git a/assets/misc_faq.md.a2c23752.lean.js b/assets/misc_faq.md.a2c23752.lean.js new file mode 100644 index 0000000..e92792d --- /dev/null +++ b/assets/misc_faq.md.a2c23752.lean.js @@ -0,0 +1 @@ +import{A as d}from"./chunks/Accordion.aff4964d.js";import{o as a,c as n,F as u,B as c,b as h,w as r,k as e,t as i,e as p,l as f,a as m,E as y}from"./chunks/framework.47aa8d5b.js";const k={faq:[{question:"What is the goal of this project?",answer:"To provide an open source, affordable VR eye tracker for Social games like VRChat as well as provide an open eye tracking platform.",hyper_link:"",link_description:""},{question:"What headsets will be supported?",answer:"Any headset that the hardware can fit in with 
community mounts or a mount you design.",hyper_link:"",link_description:""},{question:"How will this work?",answer:"Currently, a camera is mounted inside the headset for each eye. The camera streams through wifi to a PC client which processes and sends eye tracking data to an OSC endpoint ex VRChat.",hyper_link:"",link_description:""},{question:"What features will be supported?",answer:"The goal is eye tracking with eye openness, and some form of pupil dilation. A far away aspiration of this project is some form of weak foveated rendering because it's cool and any small performance increase in VR is welcome.",hyper_link:"",link_description:""},{question:"When will this be completed?",answer:"When it's done 😉 I have a semi busy life so development may slow and speed up inconsistently.",hyper_link:"",link_description:""},{question:"Will IR damage my eyes?",answer:"This project has safety in mind. If you do all of the safety measures that we put into place and visually test the amount of IR light you will be fine. Please note we have not finished development of all safety stuff so be careful.",hyper_link:"https://look.ams-osram.com/m/1d720a7b18ab3fce/original/Details-on-photobiological-safety-of-LED-light-sources.pdf",link_description:" here is a pdf with safety information"},{question:"How expensive will this be?",answer:"My goal is to keep it as cheap as possible with around $75 as the absolute max, with current setups being around $30-80. aaaa",hyper_link:"",link_description:""},{question:"How do I set up my avatar?",answer:"Check out the VR Chat face tracking wiki on our github. Keep in mind that we currently only support float parameters. ",hyper_link:"",link_description:""},{question:"What hardware does this use / How do I build?",answer:"Checkout our build guides on this site for specific information, hardware may change and evolve over time.",hyper_link:"",link_description:""},{question:"Where are the docs?",answer:"All of our documentation is located on this website, all future documentation will be found here as well.",hyper_link:"",link_description:""}]},_={class:"font-semibold text-xl"},w={class:""},g={style:{"list-style":"none"}},q={key:0},b=["href"],v={__name:"FAQ",setup(l){return(o,t)=>(a(),n("div",null,[(a(!0),n(u,null,c(f(k).faq,s=>(a(),h(d,{class:"mb-4"},{title:r(()=>[e("span",_,i(s.question),1)]),content:r(()=>[e("div",null,[e("span",w,[e("blockquote",null,[e("ul",g,[e("li",null,i(s.answer),1),s.hyper_link!=o.NULL&&s.hyper_link!=""?(a(),n("p",q,[e("a",{href:s.hyper_link,target:"_blank"},i(s.link_description),9,b)])):p("",!0)])])])])]),_:2},1024))),256))]))}},V=JSON.parse('{"title":"Frequently Asked Questions","description":"","frontmatter":{},"headers":[],"relativePath":"misc/faq.md","filePath":"misc/faq.md","lastUpdated":1676223729000}'),x={name:"misc/faq.md"},W=Object.assign(x,{setup(l){return(o,t)=>(a(),n("div",null,[t[0]||(t[0]=e("h1",{class:"text-[var(--font-accent)]",id:"frequently-asked-questions",tabindex:"-1"},[m("Frequently Asked Questions "),e("a",{class:"header-anchor",href:"#frequently-asked-questions","aria-label":'Permalink to "Frequently Asked Questions {.text-[var(--font-accent)]}"'},"​")],-1)),t[1]||(t[1]=e("br",null,null,-1)),t[2]||(t[2]=e("hr",null,null,-1)),y(v)]))}});export{V as __pageData,W as default}; diff --git a/contact.html b/contact.html index 95f7864..4df72e3 100644 --- a/contact.html +++ b/contact.html @@ -18,7 +18,7 @@
- + \ No newline at end of file diff --git a/dev_roadmap.html b/dev_roadmap.html index 92eeaff..ba5f017 100644 --- a/dev_roadmap.html +++ b/dev_roadmap.html @@ -19,7 +19,7 @@
Skip to content

Released under the MIT License.

- + \ No newline at end of file diff --git a/development/docs/dev_docs.html b/development/docs/dev_docs.html index 8040fba..2ba0d7f 100644 --- a/development/docs/dev_docs.html +++ b/development/docs/dev_docs.html @@ -18,7 +18,7 @@
Skip to content

Development Documentation for the EyeTrackVR Docs Site

This is the documentation for the EyeTrackVR Docs site. It is built using VitePress.

Getting Started

Prerequisites

  • Node.js (v14.15.4 or higher)
  • Yarn (v1.22.10 or higher)
  • Git (v2.30.1 or higher)
  • VSCode (v1.56.2 or higher)

Installation

  1. Clone the repo

    sh
    git clone https://github.com/EyeTrackVR/EyeTrackVR-Docs.git
  2. Navigate to the vitepress folder

    sh
    cd vitepress
  3. Install Yarn packages

    sh
     yarn
  4. Start the dev server

    sh
    yarn dev
  5. Open the site in your browser

Contributing

Project Structure

  1. Familiarize yourself with the VitePress documentation.
  2. Then, move on to our project specific documentation below.

Standards

  • All documentation should be written in Markdown or Vue components.
  • All file names are snake case and lowercase letters.
  • Do not make naming changes to the vitepress folder.
  • Do not make major changes to the vitepress folder structure without prior consultation with team members.
  • For Vue components, follow the Vue 3 docs.

Released under the MIT License.

- + \ No newline at end of file diff --git a/development/docs/pages.html b/development/docs/pages.html index 3f59436..3db6734 100644 --- a/development/docs/pages.html +++ b/development/docs/pages.html @@ -26,7 +26,7 @@ { text: "My Page", link: "/my_folder/my_page" }, // Add this line - subdir then file ], },

Note

The link property does not require a file extension. Please do not add it.

Title Style

The title style is the style of the title that appears at the top of the page.

We like to keep this cohesive, so please use the following style:

md
# My Page {.text-[var(--font-accent)]}

This will give the title a nice orange colour.

The {} are required, and the .text-[#e67e22] is the colour using TailWindCSS classes. You can also use our built in CSS variables {.text-[var(--font-accent)]}. You will find these in the src/styles/theme.css file. You can change this to any colour you like, but please keep it consistent with the rest of the site.

Tip

This entire site supports TailWindCSS classes. You can find the documentation here.

All classes you wish to add, must be prefaced with a . when inside of the {}.

When using classes on HTML elements, you can use the class attribute. For example:

html
<h1 class="text-[#e67e22]">My Page</h1>

Editing a Page

To edit a page, you will need to edit the Markdown file in the vitepress/docs/src/pages folder.

- + \ No newline at end of file diff --git a/development/docs/standards.html b/development/docs/standards.html index 8e52ad7..4f4c254 100644 --- a/development/docs/standards.html +++ b/development/docs/standards.html @@ -26,7 +26,7 @@ - added new thing # some detail about the new thing BREAKING CHANGE: this is a breaking change #this line is optionaland only used if needed - + \ No newline at end of file diff --git a/firmware_guide/configure_firmware.html b/firmware_guide/configure_firmware.html index 9f34651..f6b740e 100644 --- a/firmware_guide/configure_firmware.html +++ b/firmware_guide/configure_firmware.html @@ -20,7 +20,7 @@
Skip to content

Configuring the firmware

Once you have opened the project, you should see something on the left side like this

'img'

Open the ini/user_config.ini file

'img'

INFO

These settings are applicable only when using Wi-Fi for communication with your computer. If you are using USB, you can skip to reading about Environments.

Replace the placeholder text with your correct SSID (WiFi access point name), and password respectively.

INFO

Special characters such as ! and @ are not supported. If you have a special character in your password or ssid, you will need to change it.

Similarly, spaces are not supported in the SSID and password. If you have a space in either, you will need to change it.

The firmware supports AP mode, however it is not recommended. If you wish to use AP mode, you will need to set the enableadhoc to 1.

Then, you will need to set the ap_ssid to the name of your AP, and ap_password to the password of your AP.

You can choose to leave the ap_ssid and ap_password as defaults, but you will need to set the enableadhoc to 1.

Note: If you are enabling AP mode for both ESPs, you will need to set ap_ssid and ap_password to different values on each ESP. It is best to enable AP mode on only one ESP and leave the other in STA mode: connect the STA-mode ESP to the AP of the AP-mode ESP, then connect your computer to that same AP. This lets you reach both the ESP in AP mode and the ESP in STA mode.
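For reference, the relevant part of ini/user_config.ini then looks roughly like the sketch below. The AP-related keys are the ones named above; the exact key names for the normal SSID and password may differ slightly in your firmware version, so treat them as placeholders.

ini
ssid = "MyHomeNetwork"         # your 2.4GHz network name (no spaces or special characters)
password = "MyWifiPassword"    # the matching password
enableadhoc = 0                # set to 1 only if you want this ESP to host its own access point
ap_ssid = "openiris_ap"        # only used when enableadhoc = 1
ap_password = "openiris_pass"  # only used when enableadhoc = 1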

CAUTION

Make sure your WiFi router has a 2.4 GHz band. While most do, this is not always the case. Setting each band (5GHz, and 2.4GHz) to different SSIDs is recommended, though not required.

Double check that you have correctly entered your WiFi credentials and that said wifi network has a 2.4GHz band.

Additional configuration

mDNS

If you do not wish to manually keep track of the ESPs IP addresses and ports, you can enable the mDNS feature. This will allow you to connect to the ESPs using the following format: http://<some_name>.local. This feature only works when you are on the same network as the ESPs and have mDNS enabled on your computer. If you are using Windows, you can enable mDNS by following this guide.

OTA

The firmware supports OTA (over-the-air) updates. This means that you can update the firmware without having to physically connect to the ESPs. To enable OTA updates, set enableota to 1. Then, set otapassword to the password you wish to use when updating the firmware. If you do not wish to use a password, you can leave otapassword empty; however, this is not recommended, as anyone on your network would then be able to update the firmware.

You will also need to set otaserverip to the IP address of the ESP you wish to update. If you are using mDNS, you can use the ESP's mDNS name instead. For example, if the ESP's mDNS name is http://esp32.local, set otaserverip to http://esp32.local.
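As a sketch, a minimal OTA configuration could look like this (values are placeholders; otaserverport is shown with the default value used in the upload guide later on):

ini
enableota = 1                          # turn OTA updates on
otapassword = "12345678"               # password required to push an update
otaserverip = "openiristracker.local"  # IP or mDNS name of the ESP to update
otaserverport = 3232                   # default port used in the examples in this guide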

REST API

The firmware also supports a fully featured REST API. This means that you can control the ESPs using REST calls. This feature is enabled by default and cannot be turned off.

This feature is used more for advanced users, and is not required for basic operation.

The REST API was developed to be used by our new app (still in development). However, it can be used by any REST API client. Note that you cannot send POST requests to the ESPs from a browser, so some REST API functionality can only be used with tools like Thunder Client or Postman.

The full REST API is documented here.

Environments

The firmware supports multiple environments. This means that we have full support for multiple types of ESPs; each board type has its own environment. If you are using a different ESP, you will need to change the environment to the correct one.

Please reference the Firmware Environments page for more information.

Now, move on to uploading the firmware

Released under the MIT License.

- + \ No newline at end of file diff --git a/firmware_guide/environments.html b/firmware_guide/environments.html index fb1393a..ffee916 100644 --- a/firmware_guide/environments.html +++ b/firmware_guide/environments.html @@ -19,7 +19,7 @@
Skip to content

Firmware Environments

What is it?

A firmware environment is a way to store the custom firmware config for a specific device. This is useful if you want to have multiple devices with different firmware configs.

We use environments to store the firmware config for each device, so that you can easily switch between them.

Environments are broken up into two distinct parts:

  • The build type (e.g. debug, release)
  • The board type (e.g. espaithinker, esp32cam)

Build Types

We now have a couple of options letting us decide how we want to build the firmware.

  • Debug - The debug environment is the default environment and does not need to be specified. It has a lot of logging so it is useful for getting everything setup for the first time and to see what is going on.

  • _release - Has far less logging and may be missing some debug-only features; this should be flashed once everything is working.

Examples:
xiaosenses3 - This is the debug environment.
xiaosenses3_release - This is the release environment for daily use.

Tip

We recommend starting with a debug environment, and then switching to release once everything is working. A debug environment should not be used permanently.

Board Types

We currently support several different boards, and we are working on adding more.

Warning

It is important to note that the esp32AIThinker environment is the default environment. Only switch your environment if you have another board or an environment is not working.

Most esp32Cams from Aliexpress will work with the esp32AIThinker environment, but some may require the esp32cam environment. Test the default environment first, and if it does not work, try the esp32cam environment.

Currently supported boards:

  • esp32AIThinker - This is the default environment. This is for the ESP32-AI-THINKER and generic alibaba/aliexpress/amazon esp cam boards.
  • esp32Cam - This is for the special ESP32-CAM, it is unlikely that you will need to use this environment.
  • esp32M5Stack - This is for ESP32M5Stack.
  • esp32WRover - This is for the ESP32WRover.
  • esp-eye - This is for the ESP-EYE (not the S3 variant).
  • wrooms3 - For FREENOVE-ESP32-S3 (wireless mode)
  • wrooms3QIO - For FREENOVE-ESP32-S3 (wireless mode, for boards with octal flash)
  • wrooms3USB - For FREENOVE-ESP32-S3 (wired mode)
  • wrooms3QIOUSB - For FREENOVE-ESP32-S3 (wired mode, for boards with octal flash)
  • xiaosenses3 - For Seeed Studio's XIAO ESP32-S3 Sense (wireless mode)
  • xiaosenses3_USB - For Seeed Studio's XIAO ESP32-S3 Sense (wired mode)

You can change the board environment by changing the default_envs argument in platformio.ini to a supported board, as shown in the GIF below:
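For example, switching to the XIAO ESP32-S3 Sense release environment is a one-line change (a sketch; use the environment name from the list above that matches your board):

ini
[platformio]
default_envs = xiaosenses3_release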

GIF showing a switch of board environments.

Released under the MIT License.

- + \ No newline at end of file diff --git a/firmware_guide/firmware.html b/firmware_guide/firmware.html index cffcf62..d2e2cca 100644 --- a/firmware_guide/firmware.html +++ b/firmware_guide/firmware.html @@ -18,7 +18,7 @@
Skip to content

What is this?

Firmware is the second part of the equation to get your trackers going. It lives on the ESP32 chip, and is responsible for streaming video data from the tracker. We currently fully support wireless streaming over 2.4GHz, and are working on adding support for wired streaming to the main desktop App.

What should I download and where?

You can access the firmware, as well as flash your boards, using our new Firmware Flashing Tool.

Caution

This tool is currently in beta.

We are working on adding support for Over-The-Air (OTA) and configuring network settings. If you have any issues, please let us know by opening an issue on the repository.

This tool relies on our own solution called OpenIris, found here.

Note

You do not need to download OpenIris separately. The flashing tool will download it for you.

To check the status of the project, please visit the Development Road Map

I want to use VSCode - How would I do this?

Follow the steps described here

Released under the MIT License.

- + \ No newline at end of file diff --git a/firmware_guide/flashing_tool.html b/firmware_guide/flashing_tool.html index 1ac19c6..f8103fb 100644 --- a/firmware_guide/flashing_tool.html +++ b/firmware_guide/flashing_tool.html @@ -19,7 +19,7 @@
Skip to content

Firmware Flashing Tool

Before we proceed

Note

Step 1: Download and Install the Firmware Flashing Tool

  • Download the Firmware Flashing Tool Installer from the latest GitHub release.
  • Run the installer and follow the on-screen instructions to complete the installation of the Firmware Flashing Tool app.

Step 2: Run the Firmware Flashing Tool app

After the installation is complete, you should see an icon on your desktop. To open the application, double-click the icon.

You then should be greeted with a GUI that looks like this:

Flashing Tool ui lower half

Step 3: Select board.

We now have a couple of options letting us decide how we want to build the firmware.

  • Debug
    This is the default environment and does not need to be explicitly chosen. It includes extensive logging, making it ideal for initial setup and troubleshooting, as it provides comprehensive visibility into the system’s operation.

  • _release
    Has far less logging and may be missing some debug-only features; this should be flashed once everything is working.

Left-click the Select Board dropdown, then choose your desired board from the list. Once you've selected your board, click Confirm to move on to the next step.

preview of manage boards

Step 4: Flash firmware.

Once you're ready, hold down the B or IO0 button on your board and connect it to your PC. (The button may be small, but it's there). Next, click Install OpenIris and select the port corresponding to your board to complete the installation.

preview of how to install openiris

Step 5: That's it!

Installation complete! Everything is set up and ready to go.
Click Show Logs to view the results.

preview of flashing success

Step 6: Logs.

If you choose the wired option, the correct logs should appear as follows.
If the logs don't look right, ensure that your camera is properly connected to the board.

logs

How to configure wifi.

Note

  • Please note that 5GHz networks are not supported by the hardware, so you will need to use a 2.4GHz network.

Step 1: Select board

Ensure that the selected board has a wireless mode checkmark below it.

preview of manage boards

Step 2: Configure wifi network

You'll need to provide information about the network you're planning to connect to.

SSID
The SSID is your Wi-Fi network name; make sure it doesn't contain any special characters.

Password
You'll need to provide the password used to connect to the selected Wi-Fi network.

How to send wifi

Step 3: Send wifi credentials.

Once the installation is complete, unplug your board, then reconnect it to the PC without pressing any buttons and press Send credentials.

how to setup credentials

Step 4: Get working stream.

After sending is complete, disconnect your board, then reconnect it to the PC without pressing any buttons, and click Show logs.
Scroll down to find the text labeled IP: 192.168.XXX.XXX.

camera preview

This IP can then be entered into the app or a web browser like: http://192.168.XXX.XXX

Released under the MIT License.

- + \ No newline at end of file diff --git a/firmware_guide/mdns.html b/firmware_guide/mdns.html index 3708485..1cc5a5f 100644 --- a/firmware_guide/mdns.html +++ b/firmware_guide/mdns.html @@ -18,7 +18,7 @@
Skip to content

MDNS

What is it?

mDNS is a protocol that allows you to connect to a device using a name instead of an IP address. This is useful if you do not know the IP address of the device, or if you do not want to keep track of the IP address of the device. This is also useful if you are using a device that does not have a static IP address.

In the mDNS protocol the IP address of the device can change, but the name of the device will always be the same. This means that you can connect to the device using the name of the device, even if the IP address of the device changes.

How to use it

Enable mDNS

mDNS is enabled by default on the new firmware and cannot be disabled.

To use it, all you need to do is set the name of the device in the firmware config file.

This setting can be located under the [wifi] section of the ini/user_config.ini file.

ini
mdnsname = "openiristracker" # do not add .local

By default, the name of the device is openiristracker, however you can change it to whatever you want.

Change the name

Since you have two ESP32 devices, you need to make sure that the name of the device is different for each device. If you do not change the name of the device, you will not be able to connect to both devices at the same time.
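For example, in each device's ini/user_config.ini you might use names like these (the names themselves are arbitrary examples):

ini
# left tracker
mdnsname = "openiristrackerl" # reachable at http://openiristrackerl.local

# right tracker
mdnsname = "openiristrackerr" # reachable at http://openiristrackerr.local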

Connect to the device

To connect to the device, you need to use the name of the device followed by .local. For example, if the name of the device is esp32, you can connect to the device using http://esp32.local.

Prerequisites Bonjour

Bonjour is required to use the mDNS protocol on Windows and macOS only. If you are using Windows, you can download Bonjour from here. If you are using macOS, Bonjour is already installed.

You can also get the fully tested Bonjour package from us here (for Windows only).

Troubleshooting

Cannot connect to the device

If you cannot connect to the device, make sure that the name of the device is correct. If the name is correct, make sure that the device is connected to the network.

If the device is connected to the network, try to restart the device. If the device is still not connected, try to restart the router.

Released under the MIT License.

- + \ No newline at end of file diff --git a/firmware_guide/rest_api.html b/firmware_guide/rest_api.html index e4a8ead..f3fb89f 100644 --- a/firmware_guide/rest_api.html +++ b/firmware_guide/rest_api.html @@ -44,7 +44,7 @@ WIFI_POWER_2dBm = 8,// 2dBm WIFI_POWER_MINUS_1dBm = -4// -1dBm } wifi_power_t;
Param | Description
txPower | The power level to set.

Camera Params

Param | Description | Value Effect
vflip | Whether to flip the frames vertically. | 0 or 1
framesize | A value between 0-7 indicating the frame resolution. | Larger number = higher resolution.
hflip | Whether to flip the frames horizontally. | 0 or 1
quality | The JPEG quality level: 1-63? | Smaller number = higher quality, more latency and less fps
brightness | The agc_gain of the camera. | Larger number = brighter.
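As a rough illustration of driving these parameters from a REST client (curl, Postman, Thunder Client): the endpoint path below is a placeholder, not the real route, so check the endpoints documented on this page for the actual paths.

sh
# hypothetical example only - replace <camera-endpoint> with the real route from this page
curl -X POST "http://openiristracker.local/<camera-endpoint>?vflip=1&framesize=5&quality=12"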
- + \ No newline at end of file diff --git a/firmware_guide/setup_vscode.html b/firmware_guide/setup_vscode.html index 52f4d65..fedff1e 100644 --- a/firmware_guide/setup_vscode.html +++ b/firmware_guide/setup_vscode.html @@ -19,7 +19,7 @@
Skip to content

Setting up the environment

This procedure will show how to prepare your system for uploading the firmware to your tracker.

1. Install Visual Studio Code

Download the latest Visual Studio Code and install it.

Download


'img'

Install


'img'

2. Install the drivers

In order to flash the firmware, you'll need some drivers, mainly the CH340 drivers. Here's where to get them:

https://learn.sparkfun.com/tutorials/how-to-install-ch340-drivers/all

3. Install PlatformIO IDE

Once Visual Studio Code is installed, open it and install PlatformIO IDE for VSCode, an extension that will allow you to connect to the tracker, build and upload the firmware.

'img'

4. Clone the firmware project

Make sure you close any current projects you have open or open a new window before moving forward with these steps.

  1. Go to https://github.com/EyeTrackVR/OpenIris and clone the latest version from the main branch.
    1. If you do not have Git installed, please install it from here.

    2. Open Git Bash.

'img'
  2. Change the directory to the folder you want the firmware to be cloned into. Ex: cd C:/
'img'
  3. Clone the repository by entering the command: git clone https://github.com/EyeTrackVR/OpenIris.git
'img'

For more info about cloning please refer to this documentation

Do not download the project as a zip - please clone it properly. Take note of the path you run the clone command in; this is where the files will be downloaded, and you will need this path when we open the project in PlatformIO later.
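Putting the clone steps together, the terminal session looks roughly like this (C:/ is just an example destination; use whichever path you noted):

sh
cd C:/
git clone https://github.com/EyeTrackVR/OpenIris.git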

  4. Open the firmware in VSCode by going to PlatformIO, selecting Open, then navigating to the OpenIris/ESP folder and opening it.
'img'

This is an adaptation of the SlimeVR docs. Some credit goes to the SlimeVR team; adapted from here.

Released under the MIT License.

- + \ No newline at end of file diff --git a/firmware_guide/update_platformio.html b/firmware_guide/update_platformio.html index b51311c..9fd4e56 100644 --- a/firmware_guide/update_platformio.html +++ b/firmware_guide/update_platformio.html @@ -19,7 +19,7 @@
Skip to content

Updating Platformio

Sometimes, when building and uploading the firmware, you may run into bizarre errors: even though the code is correct, PlatformIO will refuse to build and will instead complain with some strange error.

It usually means that either PlatformIO or the Arduino SDK got an update, and that's the moment when this guide may come in handy.

Updating PlatformIO itself

There are two ways to update your PlatformIO installation.

Via CLI

You can update it easily by typing a single command into your terminal.

pio upgrade command in the terminal
  1. Open a terminal / command prompt
  2. Type pio upgrade and press enter
  3. Wait for it to finish
  4. Done
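In other words, the CLI route boils down to a single command:

sh
pio upgrade   # updates the PlatformIO core itself (the SDK update is covered further down)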

Via PlatformIO in Visual Studio Code

You can also update it using the platformio tab in your Visual Studio Code.

  1. Open VSC
  2. Click on the little ant icon on the sidebar
Platformio logo button
  3. A side panel will open with a couple of sections; open the one called Quick Access.
Platformio logo Quick Access Section
  4. In the Quick Access section, click on the Miscellaneous tab
Platformio miscellaneous section
  5. Then, click on Upgrade PlatformIO Core
Platformio upgrade core button
  6. Wait for it to finish and done!

Upgrading the SDK

You'll also need to upgrade the SDK from time to time. To do this:

Upgrading the SDK via VSC

  1. Follow the guide above up until clicking on the Quick Access section.
  2. Instead, open the PIO Home and select Open
Platformio open home
  3. This will open the PlatformIO dashboard; in there, click on the Platforms button on the sidebar
Platformio platforms sidebar
  4. Once there, you should see a couple of tabs next to the sidebar, such as Installed or Embedded. Click on Updates
Platformio platforms sections
  5. Locate the platform called Espressif 32 and click on Update to <version>
Platformio platforms update view
  6. Wait until it is done and that's it!

Released under the MIT License.

- + \ No newline at end of file diff --git a/firmware_guide/upload_and_update_firmware.html b/firmware_guide/upload_and_update_firmware.html index f591458..e5bfaa3 100644 --- a/firmware_guide/upload_and_update_firmware.html +++ b/firmware_guide/upload_and_update_firmware.html @@ -24,7 +24,7 @@ otaserverip = "openiristrackerL.local" # here we use a custom mDNS name otapassword = "12345678" otaserverport = 3232
  1. Change to OTA env

To do that, in Visual Studio Code, locate the upload button; next to it you will see a couple of other buttons and your current environment.

For example, if you have a working esp32AIThinker environment, you would change your environment to esp32AIThinker_OTA.

Click on it.

'An example of how the popup list of envs looks like'

This will open a list of all available environments, select the one that matches your board and has a _OTA suffix

'An example of how the button for selecting env looks like'
  1. Restart the ESPs, they must be power cycled
  2. Press the upload button to upload the firmware.

    img
  3. Do not touch the ESPs or move them during the OTA upload
  4. Wait around 1 minute.
  5. Repeat for as many trackers as you need.

Finding the IP address of your tracker

Connect your tracker to your PC and then open a serial monitor in VSC by pressing the plug icon.

'platformio serial monitor button'

Now, press the restart button on the ESP itself and watch the monitor for output like this:

'img of platformio serial monitor'

The text, highlighted in green for demonstration, is the stream address of the camera. Take note of this for input into the software.

Keep in mind, while testing and getting set up, that the ESP can only have one client. To use the camera in the app, make sure you close the browser tab you tested the stream in.

Updating the firmware to the newest version

To update the firmware you'll need to follow a couple of steps.

  1. Open a terminal and enter the directory you've cloned OpenIris into, for example cd C:/OpenIris/

  2. Save your current changes to the config using git stash

  3. Download the newest changes from the repo using git pull

  4. Apply your saved changes so that you don't have to retype your WI-FI credentials using git stash apply

CAUTION

Skip this step if the update output indicates that the config format has changed. If it has, retype your credentials in the updated files instead.

4.1. If anything goes wrong, you can reset everything to the default state using git reset --hard and then retype your credentials.

  5. Upload your firmware following the steps from the uploading section
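Put together, the update sequence from the steps above looks like this (adjust the path to wherever you cloned OpenIris):

sh
cd C:/OpenIris/
git stash        # save your local config changes
git pull         # download the newest firmware
git stash apply  # re-apply your saved config (skip if the config format changed)
# if anything went wrong, reset to the default state:
# git reset --hard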

Troubleshooting

If you encountered an issue while following these steps check the FAQ.

If you don't find an answer to your question there, ask in the #questions channel on the Discord; we will be happy to help.

Adapted from the SlimeVR docs. Some credit goes to the SlimeVR team here

- + \ No newline at end of file diff --git a/getting_started/intro.html b/getting_started/intro.html index 90b3abb..1bdc238 100644 --- a/getting_started/intro.html +++ b/getting_started/intro.html @@ -20,7 +20,7 @@
Skip to content

EyeTrackVR

EyeTrackVR is a Source First, affordable eye-tracking platform designed specifically for Virtual Reality—especially for Social VR experiences like VRChat. It works by sending real-time eye tracking data over OSC or UDP, making it compatible with many existing VR applications.

Whether you're here to build your own tracker, contribute to development, or explore how it works, you're in the right place.


Get Started

This documentation covers everything from assembly and firmware setup to headset mounting and safety. If you're new to the project, we recommend starting with the following pages:

CAUTION

This project is under active development, but it's already working reliably for many users. Your feedback helps us improve!


Eye Safety Notice

Eye safety is a top priority. Infrared (IR) light is invisible but not harmless—especially at close distances and high intensities. EyeTrackVR has built-in safety measures, but users must also follow proper precautions.

Always use the recommended LEDs, wiring, and firmware settings.

DANGER

Never bypass safety features or swap out IR components without understanding the risks. Doing so could result in permanent eye damage.

Our default configuration stays well below international safety thresholds (see ICNIRP guidelines), but it's your responsibility to:

  • Use non-focused emitters
  • Keep total radiant intensity below 5 mW/sr
  • Verify LED brightness with an IR camera
  • Stop using the device if you feel warmth or discomfort in your eyes
WARNING
Please pay attention

Use only non-focused LEDs, and ensure total output is 4 mW/sr or less.

Safety References

Effect of infrared radiation on the lens
Photobiological safety of LED light sources
ICNIRP Guidelines (0.38 to 3 µm)
Training Library – NIR Standards


Hardware Overview

See our Parts List and Hardware Repository to source components, print mounts, and prepare your setup.

Firmware

The firmware powering EyeTrackVR's ESP32 cameras is called OpenIris, developed by lorow.
You can find it here on GitHub.


Headset Compatibility

EyeTrackVR can work with any VR headset—but not every model has a 3D-printed mount available yet.

Check our mount list to see if your headset is supported.
If not, you’re welcome to design your own or wait for a community-sourced version.


Community & Support

Join our Discord server for updates and community projects. We're excited to have you on board!

'discord banner'

Released under the MIT License.

- + \ No newline at end of file diff --git a/getting_started/led_safety.html b/getting_started/led_safety.html index 21ac431..d21fd74 100644 --- a/getting_started/led_safety.html +++ b/getting_started/led_safety.html @@ -35,7 +35,7 @@ Ie = Ie₍20ₘₐ₎ × I / 20 mA E (per LED) = Ie / A E_total = E × (# of LEDs)

Appendix B – Quoted Guideline (ICNIRP 2013)

“To avoid thermal injury of the cornea and possible delayed effects on the lens of the eye (cataractogenesis), infrared radiation (780 nm < λ < 3 µm) should be limited to 100 W m⁻² (10 mW cm⁻²) for lengthy exposures (> 1000 s).”

ICNIRP Guidelines on Limits of Exposure to Incoherent Optical Radiation (2013)
PDF


If you have suggestions, believe these calculations are inaccurate, or wish to contribute improvements, please open a discussion on GitHub. Your feedback helps us keep everyone safe.

- + \ No newline at end of file diff --git a/getting_started/things_to_know.html b/getting_started/things_to_know.html index 09e0ce7..041e871 100644 --- a/getting_started/things_to_know.html +++ b/getting_started/things_to_know.html @@ -18,7 +18,7 @@
Skip to content

Things To Know Before You Start

This page is an overview of things to know and understand before you start this project

Software is not final

Bugs can and will occur which may hinder the experience or usability at times. Do not expect 100% reliable and perfect function. Not all features are fully developed yet.

Firmware may have user difficulties

As our firmware matures, bugs may creep up that make the process not flawless.

IR emitters can be dangerous

If handled wrong or precautions are ignored you can and will cause damage to your eyes. Please make sure to only buy linked emitters at calculated power or ones with specs that match exactly. We will not be responsible for damage caused if you go your own route. Our official kits are designed to be well within safety limits when used as designed.

Documentation may be incomplete

Important steps may be missing or unclear. We are working to bring documentation to completeness but it is a large undertaking and takes time. Feel free to ask questions in our Discord if things are unclear.

Once you have acknowledged the items above, move on to the Full Build Guide

Released under the MIT License.

- + \ No newline at end of file diff --git a/hashmap.json b/hashmap.json index 5324f48..6ebfa87 100644 --- a/hashmap.json +++ b/hashmap.json @@ -1 +1 @@ -{"contact.md":"034a53d4","firmware_guide_firmware.md":"6c8e1ae6","firmware_guide_environments.md":"6b0604e7","firmware_guide_flashing_tool.md":"c180a2ca","firmware_guide_mdns.md":"8a95e855","archive_fox_ir_v2_build_instructions.md":"ebf3c880","getting_started_led_safety.md":"78277ff3","how_to_build_parts_list.md":"d57cd54a","how_to_build_creating_your_own_mount.md":"70ac1088","how_to_build_full_build.md":"f324b1f6","getting_started_intro.md":"d6d8f855","firmware_guide_rest_api.md":"8a0ea54a","getting_started_things_to_know.md":"4cd89e47","headset_guides_valve_index.md":"891da17f","headset_guides_what_is_this.md":"60052a7d","dev_roadmap.md":"2f96c918","firmware_guide_upload_and_update_firmware.md":"0806ea0f","how_to_build_3d_printed_mounts.md":"5d2aa5cb","software_guide_vrcft_tracking_module.md":"d148b9dc","software_guide_build_software.md":"b0d1c681","development_docs_pages.md":"85e22694","how_to_build_v4_full_build.md":"0dd3c5e7","misc_jlc3dp.md":"5e8875c4","development_docs_standards.md":"f9558435","development_docs_dev_docs.md":"b7b0d76e","how_to_build_part_list.md":"3640f4dd","how_to_build_preparing_cameras.md":"0a4aab5a","how_to_build_preparing_xiao.md":"fcacda59","how_to_build_led_setup.md":"6d99addb","index.md":"269511a8","misc_faq.md":"8d641c6c","misc_vrc_avatar_setup.md":"2d1bc492","about.md":"055b7111","software_guide_osc_setup.md":"af1d1fce","firmware_guide_setup_vscode.md":"fefbbc55","firmware_guide_update_platformio.md":"d52a99e0","firmware_guide_configure_firmware.md":"4eb536cf","software_guide_eyetrackvr_app_guide.md":"dce4a9fb"} +{"development_docs_pages.md":"85e22694","firmware_guide_update_platformio.md":"d52a99e0","about.md":"055b7111","firmware_guide_setup_vscode.md":"fefbbc55","getting_started_led_safety.md":"78277ff3","how_to_build_led_setup.md":"6d99addb","firmware_guide_environments.md":"6b0604e7","headset_guides_valve_index.md":"891da17f","how_to_build_3d_printed_mounts.md":"5d2aa5cb","firmware_guide_configure_firmware.md":"4eb536cf","contact.md":"034a53d4","development_docs_standards.md":"f9558435","misc_jlc3dp.md":"5e8875c4","misc_vrc_avatar_setup.md":"2d1bc492","getting_started_things_to_know.md":"4cd89e47","how_to_build_part_list.md":"3640f4dd","index.md":"269511a8","software_guide_build_software.md":"b0d1c681","software_guide_eyetrackvr_app_guide.md":"dce4a9fb","software_guide_osc_setup.md":"af1d1fce","getting_started_intro.md":"d6d8f855","firmware_guide_rest_api.md":"8a0ea54a","firmware_guide_upload_and_update_firmware.md":"0806ea0f","how_to_build_v4_full_build.md":"0dd3c5e7","how_to_build_preparing_xiao.md":"fcacda59","how_to_build_preparing_cameras.md":"0a4aab5a","software_guide_vrcft_tracking_module.md":"d148b9dc","firmware_guide_mdns.md":"8a95e855","firmware_guide_firmware.md":"6c8e1ae6","dev_roadmap.md":"2f96c918","development_docs_dev_docs.md":"b7b0d76e","how_to_build_full_build.md":"f324b1f6","how_to_build_parts_list.md":"d57cd54a","firmware_guide_flashing_tool.md":"c180a2ca","misc_faq.md":"a2c23752","headset_guides_what_is_this.md":"60052a7d","archive_fox_ir_v2_build_instructions.md":"ebf3c880","how_to_build_creating_your_own_mount.md":"70ac1088"} diff --git a/headset_guides/valve_index.html b/headset_guides/valve_index.html index b45235b..848751f 100644 --- a/headset_guides/valve_index.html +++ b/headset_guides/valve_index.html @@ -18,7 +18,7 @@
Skip to content

Valve Index

These are some proven/tested ways to do a clean ETVR build on the Valve Index. Don't treat this as a must-follow, but as a setup to go for if you don't have other plans.

Physics-Dude's 160° V18 (mirror the STL for left)

Foow17's 130 / 160°

Foow17's 130 / 160° VROPTICIAN

Frosty704's 160/120° WidmoVR Mount

Please note that due to the LED positioning, these will require you to use the 110mm extensions included in the V4 Kit

V4 LED Kit Assembly

MUTEtheCyberwolf's DEV Frunk Mod:

DevFrunk1HMD

Mod Details

The DEV Frunk is a popular choice for ETVR on the Index. Replacing the original Index frunk entirely, it has multiple mounting points for both XIAOs and a Vive Facial Tracker (or Project Babble!), as well as cutouts for 30x30x7mm fans.

It also provides excellent ventilation simply due to the design being much more open, as well as a flip up design for the face tracker mount.

Combined, it creates a clean setup with no need for hot glue or alternatives to mount your hardware.

Extra Parts Needed:

I recommend a screw kit like this one

  • 1x T5 Torx Screwdriver
  • 1x H3.0 Screwdriver
  • 1x Soldering Iron for heat inserts
  • 8x M4x8.1 Heat Inserts AliExpress Choose Size: M4 (OD 6mm) 50pcs | Color: Length 8mm
  • 6x M4x10 Screws AliExpress Choose Size: M4 20pcs | Length: 10mm
  • 1x M4x30 Screws AliExpress Choose Size: M4 20pcs | Length: 30mm

3D Printed Mounts

Found at MUTEtheCyberwolf's GitHub Repo

You will need to print:

  • 1x Coverplate for XIAO Retention Clip (Logo or no Logo)
  • 1x XIAO Retention Clip (Letters or no Letters)
  • 1x Eyetrack VR Prototype XIAO Mount.stl
  • 1x FacialTrackerBeerHingeVIDEVFrunk1.stl
  • 1x FacialTrackerBridgeVIDEVFrunk1.stl
  • 1x ValveIndexDEVFrunk1.stl
    • Alternatively, if you are interested in routing the ribbon cables internally, this modified DEV Frunk has holes at the top for sliding through ribbon cables and power cables. Example

Drawing1

Assembly Details:

1. Heat Inserts

  1. Start with inserting your heat inserts to the designated holes on the frunk. Place the heat inserts so the side with the smaller radius sits in the hole
  • When heating up the inserts, let the weight of the soldering iron do the work; they may take a few seconds to start moving on their own.
    • Do not push down, and remove the iron when they are level with the print. It doesn't need to be perfect, so take your time!

FRUNK

Bottom picture courtesy of amoistman

  2. Next, place a heat insert into the bigger hole of the facial tracker bridge

BRIDGE

  3. Lastly, place another heat insert into the thick side of the beer hinge

HINGE

  4. Your finished heat inserts should look like this

INSERTSEXAMPLE

Picture courtesy of amoistman

2. Screws

Now you're done with the hardest part! We can move onto screwing in the XIAO Mount and Facial tracker bridge onto the frunk, into the heat inserts we just inserted.

  • The XIAO mount should be placed so the longer side is not blocking the fan holes
  • If you have trouble screwing them in, try screwing them equally to distribute the pressure more evenly.
    • Screw one a little, screw the opposite the same amount, repeat.

M4SCREWGUIDE

The last two things to screw in are the beer hinge and the Vive Facial Tracker or Babble case, if you have one.

  1. Place the smaller hole of the beer hinge in between the bridge's screw holes.
    • Screw in from the side opposite the heat insert, so that the screw threads into the heat insert.

M4SCREWGUIDE2

  2. For the last screw, repeat the same steps, but line up your face tracker or babble case. Screw from the same direction, into the heat insert of the beer hinge. Use either an M4x10 or an M4x30. I find the M4x10 sometimes loses tension, while the M4x30 doesn't.

  3. Your final result should look like this FINAL

3. Mounting components

  1. You can now push the XIAOs into the mount, making sure they are seated snugly, if you haven't already.

XIAO

  2. Place your V4 LED kit PCB into the middle of the XIAO Retention Clip. Ensure the hole in the PCB aligns with the small bump on the retention clip. You will have to slide it in and push it under the overhangs until they snap over the PCB.

XIAO-V4

  3. Route your V4 LED connectors through the coverplate holes prior to putting it on

V4POWER

  4. Place the coverplate on top of the LED PCB until the front of it snaps down and locks it over the retention clip.

  5. Then you can slide the retention clip over your XIAO mount to keep them in place.

4. Replacing frunk

PLEASE UNPLUG YOUR INDEX FROM POWER BEFORE CONTINUING

You must have a T5 Torx screwdriver to remove the original Index frunk screws; we will use the same screws to hold the DEV Frunk in place.

Picture courtesy of iFixit

T5SCREWS

  1. If you do not have fans and are mounting a USB hub in front, I recommend plugging in the USB hub through the fan holes now, as you won't be able to after the DEV Frunk is screwed on.

Alternatively, you can use the USB-C port below the middle bottom screw, and pass through a hub with a female USB-C/A to male USB-A cable

  2. Once the frunk is removed, line up your DEV Frunk and screw it back in the same way. Support it so you can screw it in without it falling.

T5SCREWS2

Physics-Dude's Gumstick USB Hub Dongle

newnewnewnwe

Mod Details

Physics-Dude's Gumstick Dongle provides a way to cleanly install EyeTrackVR, all fitting into the frunk. It will require significantly more tinkering and the ability to solder.

Installation instructions and BOM available on Physics-Dude's Repo

Released under the MIT License.

- + \ No newline at end of file diff --git a/headset_guides/what_is_this.html b/headset_guides/what_is_this.html index 26c6022..4611139 100644 --- a/headset_guides/what_is_this.html +++ b/headset_guides/what_is_this.html @@ -18,7 +18,7 @@
Skip to content

What are "Headset Specific Guides"?

This is an initiative to provide specific information on specific headsets. Any weird quirks, recommended 3d printed mounts and different setup types are to be included on a page.

Complete submissions will receive a 50% off coupon for the ETVR store. Partial completions (one specific mount) will receive 15% off coupons to show our appreciation and provide an incentive. (Reach out to me in DMs to receive the code: @prohurtz)

Thank you for working to improve EyeTrackVR's accessibility!

Guidelines

Please keep it as concise as possible, and spell check before submitting. You can submit by making a PR on GitHub, or by sending the .md file to #documentation in the ETVR Discord and pinging @prohurtz. Take a look at the Valve Index guide for inspiration.

file name: headset_name.md ex. HTC_Vive_Pro_2.md, Bigscreen_Beyond.md

Page formatting and template:

Headset Name

Xyz's mod:

[image(s) of mod (limit ~2)]

Mod Details

This mod supports lens inserts and is very compact and easy to use. It provides good frunk ventilation and wide hardware support.

Extra Parts Needed:

  • 2x Camera extensions
  • 2x Camera extension connectors

Parts Link (If the parts needed are not on the additional parts page, let me know. You can also just directly link to them.)

3D Printed Mounts

Found at Xyz's GitHub

You will need to print:

  • 2x camera mount
  • 2x frunk mount

Lens insert support Yes/No: Yes Compatible mount

Note

There are 160 and 130-degree variants for camera mounts; pick which one corresponds with your cameras.

Assembly Details:

(make sure to include quirks of setup not covered in other parts of documentation)

First, screw an M2 screw into the hole on part A:

[image showing "part A" and screw]

You must have a Torx screwdriver set to remove the frunk screw; do so now.

[image]

Then attach part b and tighten:

[image]

Now clip on [image]

Abc's mod:

[image(s) of mod (limit ~2)]

Mod Details

This mod supports lens inserts and is very compact and easy to use. It provides good frunk ventilation and wide hardware support.

Extra Parts Needed:

  • 2x Camera extensions
  • 2x Camera extension connectors
  • 50 M3 screws Parts Link (If the parts needed are not on the additional parts page, let me know. You can also just directly link to them.)

3D Printed Mounts

Found at Abc's GitHub

You will need to print:

  • 2x camera mount
  • 2x frunk mount

Lens insert support Yes/No: Yes Compatible mount

Note

There are 160 and 130-degree variants for camera mounts; pick which one corresponds with your cameras.

Assembly Details:

(make sure to include quirks of setup not covered in other parts of documentation)

First, screw an M2 screw into the hole on part A:

[image showing "part A" and screw]

You must have a Torx screwdriver set to remove the frunk screw; do so now.

[image]

Then attach part b and tighten:

[image]

Now clip on [image]

Released under the MIT License.

- + \ No newline at end of file diff --git a/how_to_build/3d_printed_mounts.html b/how_to_build/3d_printed_mounts.html index c2348e2..d02c7dc 100644 --- a/how_to_build/3d_printed_mounts.html +++ b/how_to_build/3d_printed_mounts.html @@ -21,7 +21,7 @@
Skip to content

3D Printed Mounts

Mounts with a ★ next to them are the recommended mounts for the respective HMD. This is based on user feedback on what works best for most people.

NOTE

Keep in mind, some mounts only have the right (or left) version available. You may need to use Blender or a slicer to mirror the STL for the other side.

Don't see your headset?

There may be mounts in the Discord that have not been added here. Check out the #community-mounts forum for additional mounts here.

Released under the MIT License.

- + \ No newline at end of file diff --git a/how_to_build/creating_your_own_mount.html b/how_to_build/creating_your_own_mount.html index a76e3ce..d13b821 100644 --- a/how_to_build/creating_your_own_mount.html +++ b/how_to_build/creating_your_own_mount.html @@ -18,7 +18,7 @@
Skip to content

Creating Your Own Mount

This page includes a basic rundown on how to create your own camera mount for a headset that may not have any existing community mounts.

So, you have a headset that does not have any mounts or none that work well for you. Here I will show some simple steps to get some form of a mount working.

Step 1: Find a way to mount to your headset's lens

The easiest way to get started is to find a lens protector for your headset. You can visit a website like Thingiverse or do a search for "3d printed lens protector for {your headset here}"

Example: Here is a lens protector for the Quest 1/2 and Rift S https://www.thingiverse.com/thing:3653631

Step 2: Find a camera mount

Now, you need a way to attach the camera. We have a basic design created by qdot, based on his mount on the hardware GitHub here

Alternatively, you can create your own mount if you have the skills.

Step 3: Place the camera mount where it will give a good camera angle

The "ideal" location is a place that gives a good view of the pupil when looking to all extremes; the lower corners do a good job of getting there, or nearly there.

It is recommended to place the camera as close as possible to the headset's lens to maximize the view of the eye. I recommend looking at other headset mounts for inspiration.

Step 4: Mount your LEDs

You can either add parts to your mount for holding the LED boards or just glue them on. It's up to you and what works best for you, experiment!

Step 5: Test everything

It is unlikely you will get a perfect mount on the first try. Print, test, adjust, and repeat until you have a satisfactory mount. Good luck!

Step 6: Get your mounts listed

Send me the mount to be added to the docs here! You can upload it to Thingiverse or similar services, or just send me the .STLs via discord (Prohurtz#0001)

Released under the MIT License.

- + \ No newline at end of file diff --git a/how_to_build/full_build.html b/how_to_build/full_build.html index 6f9043d..1a11768 100644 --- a/how_to_build/full_build.html +++ b/how_to_build/full_build.html @@ -21,7 +21,7 @@
Skip to content

V3 Build Guide

This page will contain a step-by-step assembly guide.

Updated documentation for V4 is still in progress and not updated here, please be patient.

This guide provides a walkthrough on the assembly of a wireless V3 eye tracking setup. V4-specific final docs are a work in progress. V4 LED solderless assembly rough edit:

Sketch of V4 solderless wiring:

Sketch of V4 wiring

Credit: @balty via Discord

Sketch of solder assembly of V4:

Sketch of V4 wire

Nevertheless, it's worth noting that these instructions are also mostly applicable to those who are using wired or V4 trackers.

Step 1: Make sure you have read the Things to know before you start guide

This will give you a basic overview of the project's status and what to expect currently.

Step 2: Order all the parts listed on our Parts list

Please take note of the fact that hardware still may change, although with more developments it seems like we are going to stick with current hardware.

Step 3: Wait for things to arrive

Long shipping times from China are f u n. Please allow anywhere from 2 weeks to 2 months for everything to arrive.

Step 4: Gather up all of your hardware

Make sure you have at least the following:

'img of components'

'ESPs, cams, a programmer and a USB connector'

Step 5: Install external antennas or shield ESP antenna with an antistatic bag

Some ESP-CAM boards have issues with signal integrity; there are 2 things you can do to help or solve the issues.

The first option is to use an external antenna.

This is the best solution when it comes to the final result. If you have Vive/Tundra trackers, this is a REQUIRED step: the interference from the trackers will make your ESP stream unusable, and an antistatic bag does not help in this case. Unfortunately, switching to the external antenna is not super easy; you have to either move a resistor, or remove it and bridge 2 solder pads. The attached image below shows the orientation of the pads that need to be connected, depending on the mode. You cannot bridge all connections and have both antennas active at the same time. The 0-ohm resistor does not need to be on the board, you can simply bridge the connections.

Below is an example of bridging the connections and attaching an antenna.

'img of external antenna resistors'

The second option is to cover the ESP's antenna with an antistatic bag. This can help with the problems, and can completely solve them in some cases. Best of all, it is completely free! However, it should be noted that it performs worse than an external antenna, and in certain cases, like if you have Vive trackers, it will not solve the issue.

Step 6: Attach cameras to ESPs

Look at your ESP and locate the camera ribbon cable connector as circled below.

'img of camera socket'

Flip the gray part up to allow the cameras to be connected. Do not force it or shove objects into it to open it; fingernails are fine.

'img of camera clip'

Now slide in a camera. Note that the pins face down; you should only see the black part.

'img of camera cable'

Once the camera has been slid in, press the gray part of the connector back down. There will be a small amount of force but still be gentle. Note the amount of black coming out of the connector.

'img of camera cable'

Step 7: Connect ESP to the programmer to flash

Why flash before you have it assembled? It's simple, to make sure they actually work before you spend time soldering to them.

Slide your ESP into the programmer, and note that the USB port faces away from the ESP's camera.

Step 8: Configure Visual Studio Code and prepare to flash the firmware

Check out our guide on Setting up VS Code

Once VS Code is set up, move on to the next step.

Step 9: Plug in your ESP and flash the firmware

Our guide, Building and uploading the firmware manually has steps on how to do this. After it has flashed, make sure you get a video stream in your browser, then power it down and flash your next ESP.

Step 10: Connect your power wires to a USB Type-A board

WARNING

Powering from the programmer board will not work correctly. This delivers a lower voltage which results in dim LEDs and video artifacts. These are highly likely to mess up tracking.

Get two pairs of wire, preferably two different colors. Cut them to length (56mm in my case) and twist together two for ground and two for 5V. Here I used speaker wire where the copper denotes positive and silver negative. Then, strip the wires to about 3mm of exposed wire.

Step 11: Cut wires for IR LEDs

To find the optimal length, take a piece of wire and a marker, mock up your wire route, mark the wire, cut it, then make another at the same size for the other eye. You will need 3 different cuts of wire: 2 short ones for connecting the 2 PCBs per eye together, 2 longer ones for power or ground, and 2 slightly longer ones for power or ground for the LED near the camera at the bottom.

Once cut, strip them to around 4mm of exposed wire.

Step 12: Twist the positive USB wire and positive IR LED wires together and tin them

Once twisted together add solder to keep them together. This makes the connection much easier.

Step 13: Solder the positive wire to ESP

Lay the wire on the outside of the 5V pin and apply solder.

Step 14: Solder the negative wire to ESP

Repeat Steps 12 and 13, but with the negative wires.

INFO

As a user in our discord has learned, you can short the IO12 pin with the ground pin (GND) without issues.

In the below example I put it on the top of the pin. It will be a weak-ish joint, but that's where glue comes in handy.

Step 15: Wire up the 2nd ESP

Repeat steps 12-14 with the 2nd ESP.

Step 16: Prepare to solder IR LED PCBs

Get your magnifying glass out, it's time to solder very smol things.

Gather 4 PCBs, 4 IR LEDs, and either 4 ~350ohm resistors or 2 ~700ohm resistors (if using the single-resistor SNG option, which is not recommended).


357ohm resistors and V3 PCBs

Here are the PCB pin-out labels:


V3

LED labels:


The green markings and notched corners mark the positive sides of the LEDs pictured above.

If you have different LEDs, please consult their datasheet.

Some terminology related to them:

5V: 5-volt power in.

GND: Ground or power out.

AR: After-Resistor. This is to be used as the power in on the 2nd PCB in series, as resistors are not needed on the 2nd PCB since they are already on the 1st one.

SNG: Single resistor, use this as 5V in if you are using only 1 ~700ohm resistor on V3 boards (not recommended).

Negative: This marks the negative side of the LED.

Positive: This marks the positive side of the LED.

Step 18: Solder resistors on PCB V3

You only need 1 PCB to have resistors per eye.

Tin the resistor pads. Note: in this example, I use too much solder; it should only be enough to lightly cover the pad.

Next, grab a resistor and hold it on the pads.

While holding the resistor add solder to your soldering iron and apply it to the resistor.

I like to do this by having a piece of my solder stick up in the air and then put it on my iron that way.

Flip the PCB and solder the other end.

Now repeat for the other one.

Solder LEDs on PCB V3

Tin the LED pads.

Orient the LED and hold it in place.

Solder one end.

Flip around and solder the other end.

Wire up the PCBs V3

WARNING

Pay attention to the direction of the LEDs on the PCBs.

If the green dot is facing inwards toward the text like in the picture below:

Use the following diagram:

If the green dot is facing away from the text like the picture below:

Use the following diagram:

Step 19: 3D print mounts

Head to the 3D printed parts section of the parts list here.

Find which parts are for your headset and print them. Some may work better or worse; if there are multiple, it is recommended to test all of them, printing one of each kind. If none work, try making an edit yourself if you have the skills. If you have made a mount, make sure to ping me, Prohurtz#0001, so I can add it to the list.

Having trouble getting them to fit? Try resizing the mounts up or down a little to ensure a good fit.

There are 2 different types of mounts, how to secure the camera to each type will be documented below.

Type 1

This uses a method of sliding in the camera. This is generally the recommended mounting method, as it requires no glue.

Place the camera into the mount

Slowly apply pressure inwards until the camera snaps into place.

NOTE

There is a good chance of breaking the mount when putting in the camera. If this happens you may be able to save the mount depending on where the break was. A small dab of hot glue around the camera is likely all that is needed.

Type 2

This method involves gluing the camera in place.

Apply a bit of glue to the bottom of the camera mount.

Place the camera on the mount.

IR LED mounting

This again differs from mount to mount.

In some cases, there are designated spots for the LEDs to go.

In others there are no specified spots; you will have to mess around to find what works best. This image shows the optimal/near-optimal position for the LEDs. Hot glue is your friend with this.

TIP

Use rubbing alcohol to easily remove hot glue.

Released under the MIT License.

- + \ No newline at end of file diff --git a/how_to_build/led_setup.html b/how_to_build/led_setup.html index 766a222..36feb47 100644 --- a/how_to_build/led_setup.html +++ b/how_to_build/led_setup.html @@ -18,7 +18,7 @@
Skip to content

Pardon the mess, this page is WIP. More docs for other configurations and assembly are in the works.

How to assemble a V4 mini LED hardware kit:

Getting to know your hardware

There are 3 main components of an IR LED kit: the mainboard, LEDs, and wires.

Mainboard

Below is the V4 mini mainboard with major features labeled. V4mini
V4mini_back

LEDs

There are 2 different types of LED boards. The one you should have more of are the "N" LEDs. N stands for Normal; these make up 3 of the LEDs per eye. nLED
NLEDback

The second type is the "E" LEDs. E stands for End as these are put at the end of the LED strand.

eLED
eLEDback

In future orders the "E" LEDs will be purple for easier distinction. eLEDpurple

Wires

The included wires are a bit special: they have 3 pins on the connectors, but only 2 wires are attached. v4wire

This distinction is crucial in assembling a kit, as the wires need to go in a specific orientation outlined in the assembly picture, which shows the pins without wires as dashes - - -.

Wiring up V4 mini

Start by plugging the long wires into the main board as shown.

notpluggedr
pluggedinr

Then connect the wires to the LEDs in the sequence:

Mainboard -> N LED -> N LED -> N LED -> E LED

You need to pay very close attention to the orientation of the wires so that the missing wire is facing the correct way. The following image shows a kit fully assembled:

v4minifull

Here is an example of the right eye's LED strand.

v4minireye

How to assemble V4 Lite LED kit:

Getting to know V4 lite

V4 lite is a soldering-required approach to LEDs. There are a few specific caveats to know before assembling and using this hardware.

  1. The board must be supplied with 5V.

    • Other voltages may result in darker or non-functioning LEDs.
  2. You must change resistors if you switch between a single-eye and a dual-eye setup.

    • Not doing so will result in LEDs that are too dim or extremely bright. Ensure you are always using the correct resistor for your application.
  3. Shorts on the main board can result in damaged hardware, dangerously bright LEDs, or nonfunctioning hardware.

    • Please ensure there are no shorts (even very small stray solder strands) between any pins or solder joints on the main board or LEDs.

Wiring up V4 Lite

First, decide if you want dual-eye or single-eye operation and pick the appropriate resistor.

Single Eye:
Use the 130ohm resistor marked with Black, Black, and Gold middle rings:

v4litesingleeye

Dual Eye:
Use the 65ohm resistor marked with Yellow, White and Gold middle rings:

v4litedualeye

Now, solder the resistor onto the board (any orientation), and then the black 3-pin voltage regulator (orientation matters; solder it on the side with the white outline and have it fit the outline's shape)

v4litemainboardassem

Now, wire up the LEDs as shown.

v4litefullassmb

Released under the MIT License.

- + \ No newline at end of file diff --git a/how_to_build/part_list.html b/how_to_build/part_list.html index cd270a0..4817940 100644 --- a/how_to_build/part_list.html +++ b/how_to_build/part_list.html @@ -18,7 +18,7 @@
Skip to content

WIP new parts list. 3D printed parts will live on their own page.

Use the interactive tables below to select parts!

Required Parts:

Component | Choice | Amount | Cost | Cost All | Links
ESP Microcontroller | ★ Seeed Studio XIAO Sense ESP32-S3 | 2 | $12.87 | ~$28.92 | AliExpress XIAO Sense Recommended
Camera | ★ 130 Degree NO-IR | 2 | $5.41 | ~$10.82 | AliExpress Select "Color: 75MM-130 Degree" Must manually remove IR filter
IR LEDs | ★ Official V4 mini No-Solder | 1 | $30.00 | ~$37.00 | ETVR Store Length depending on HMD
USB Hub | ★ 4 port USB 3.2 LDLrui (MTT Usually) (Best for Cost) | 1 | $17.99 | ~$19.99 | Amazon Usually these are MTT, but sometimes not
3D Printed Mounts | Print Yourself | 0 | 0 | 0 |

Total: $0.00

Additional Parts:

Component | Choice | Amount | Cost | Cost All | Links
Type A USB Breakout | Bring Your Own | 0 | 0 | 0 |
USB-C Breakout | Bring Your Own | 0 | 0 | 0 |
Wire for ESPs | Bring Your Own | 0 | 0 | 0 |
Wire for LEDs | Bring Your Own | 0 | 0 | 0 |
Camera Extension Cables | Bring Your Own | 0 | 0 | 0 |
Camera Extension Connectors | Bring Your Own | 0 | 0 | 0 |
External Antennas | Bring Your Own | 0 | 0 | 0 |
USB-C Cables | Bring Your Own | 0 | 0 | 0 |
V4 LED Wire Extensions/Replacements | Bring Your Own | 0 | 0 | 0 |

Total: $0.00

Table modified greatly from the table on SlimeVR's Docs which was created by Carl

ESP setup options:

DANGER

There are two primary categories of trackers supported: wireless and wired. It's super important that you really look at all your options and consider your use case before making a decision.

Option 1: Wired over USB Serial

Wired Capable ESPs:

  • Seeed Studio XIAO Sense
  • FREENOVE ESP32-S3-WROOM CAM Board

This is our latest setup recommended for users with wired headsets such as the Valve Index.

Pros:

  • Much better performance and higher framerate up to 70 FPS with lower latency
  • No conflicts with advanced FBT setups such as Vive or Tundra Trackers
  • Can be wireless or wired
  • Less soldering required compared to ESP32-CAM boards

Cons:

  • Requires a USB port (e.g. on the Valve Index) and a USB hub mounted on your headset
  • Can be slightly more costly
  • When in use with a Vive Facial Tracker or other bandwidth-sensitive components, an MTT USB hub is required or the devices will be unusable (low FPS).
  • Requires beta app versions until the v2.0 app is released

Option 2: Wireless over WiFi 2.4 GHz

Wireless Capable ESPs:

  • Seeed Studio XIAO Sense
  • ESP32-CAM
  • FREENOVE ESP32-S3-WROOM CAM Board

This offers a good starting point as the ESP32-CAM boards are cheaper than wired capable trackers. However, they can be a bit more temperamental and if you are using a wired headset, they don't really make sense.

Pros:

  • Less cables (power only)
  • No issues with USB hubs
  • Easier to flash

Cons:

  • Requires two external antennas for optimal streaming quality
  • Requires WiFi 2.4 GHz router or access point in reasonable proximity
  • High risk of radio interference with FBT and other WiFi 2.4 GHz devices
  • Uses more power and heats up due to radio power needs

Released under the MIT License.

- + \ No newline at end of file diff --git a/how_to_build/parts_list.html b/how_to_build/parts_list.html index 0a7ab0a..db0df46 100644 --- a/how_to_build/parts_list.html +++ b/how_to_build/parts_list.html @@ -21,7 +21,7 @@
Skip to content

Parts List

CAUTION

Please note that no hardware has been fully set in stone, all purchases are at your own risk in case of hardware changes.

NOTE

It is good practice to buy more than needed in some cases, namely cameras, programmers, and ESPs. This reduces the risk of a DOA (dead on arrival) causing a delay.

ESP setup options:

DANGER

There are two primary categories of trackers supported: wireless and wired. It's super important that you really look at all your options and consider your use case before making a decision.

Option 1: Wireless over WiFi 2.4 GHz

This offers a good starting point as the ESP boards are cheaper than wired capable trackers. However, they can be a bit more temperamental and if you are using a wired headset, they start to make less sense.

Pros:

  • Wider user adoption and greater support
  • Low weight, less cables, easier to manage

Cons:

  • Requires two external antennas for optimal streaming quality
  • Requires WiFi 2.4 GHz router or access point in reasonable proximity
  • Risk of radio interference with FBT and other WiFi 2.4 GHz devices
  • ESP32-CAM uses more power and heats up due to radio module power needs

Option 2: Wired over USB Serial (in beta)

This is our latest setup recommended for users with wired headsets such as the Valve Index.

Pros:

  • Much better performance and higher framerates up to 70 FPS with lower latency
  • No conflicts with advanced FBT setups such as Vive or Tundra Trackers
  • Can be wireless or wired
  • Less soldering required

Cons:

  • Requires a USB port (e.g. on the Valve Index) and a USB hub mounted on your headset
  • Can be more costly

Required parts

With that in mind, here are the required parts for the setup.

Tracker boards:

    • 2x XIAO ESP32-S3 Sense Modules
      Very small size, wireless and wired support, does not require an additional programmer or separate antennas: Seeed Studio

    AliExpress

    Alternatively:

    • 2x Freenove ESP 32-S3 WROOM
      Much larger, price may be higher than the XIAOs, supports wired and wireless as well, does not need a separate programmer, does not support a separate antenna. Amazon
  • A cheaper, wireless only alternative:

    • 2x ESP32-CAM Modules - They are cheaper, require a separate programmer board, support only wireless streaming, and may require some soldering for attaching separate antennas: AliExpress
      This board also requires a special programmer board in order to flash the firmware. You'll need only one: AliExpress for just programmers or Amazon for 3 ESP32-Cams and programmers without the proper cameras

    • 2x External antennas for ESPs - Required if you're running Vive/Tundra full body tracking or having issues with streaming. See the full build guide on how to configure the ESP board for antenna use.

Cameras

Camera modules:

The cameras that come with the boards will not work; they have an IR filter inside.

The filter can be removed, but doing so may break your cameras. Here's how to do it: https://www.youtube.com/watch?v=QYH-FWvDbDc

DANGER

If you decide to remove the filter, wear eye protection while doing so; removing the filter will shatter it.

  • 2x OV2640 160° FOV IR / Night version 75mm (850nm)

(select 75MM-160 850nm) AliExpress

Amazon alternative (note that you will have to manually remove the IR filter)

Things worth noting:

  • There is an experimental option to purchase OV2640 130° FOV cameras and manually remove the IR filters. This can result in a clearer view of the eye and is smaller, so it allows for HMDs like the Bigscreen Beyond, but it's not a default recommendation just yet. Check out the listing "75MM-130 Degree" AliExpress

  • If the cameras are Dead on Arrival (DOA), keep in mind that you can swap the lenses with the ones that come with the stock cameras that were included with the ESP32 boards, though they do require extension cables. I recommend just buying more (3-4 instead of just 2).

  • Optional Ribbon extensions:

Certain mounts may require a cable extension (ex: MUTE's frunk mod). To use a mount like that, you will need a cable and a connector for each camera. Headsets like the Quest 2 do not require extensions if the camera boards are mounted at the bottom of the headset, near the facial interface.

Extensions (select 0.5mm, A-Forward Direction, 24P) AliExpress (200mm for MUTE's)

Connectors (select 24P) AliExpress

USB Hubs - if you are building the wired version

  • 1x USB 3.0+ hub The hub needs to support MTT if you wish to use a Vive Facial Tracker with the ESPs. Here is a hub that supports MTT and is not expensive: Amazon

Ensure that it is of good quality, lightweight, and compact; I suggest a minimum of four ports - one for the face tracker, one for the LED kit, and two for the ESP32-S3 boards. Ensure that you acquire appropriate USB-A to USB-C adapters wherever required. For example, the Valve Index only has one USB-A port. Be wary that "mini" adapters with the short ends may not work properly; ensure the adapter has a long end. An example adapter: Amazon.

  • 2x USB-C cables (as short as possible) to connect both ESPs (select 0.05m/5cm) AliExpress or AliExpress USB C to USB A. As the ESP32-S3 has USB-C connectors, you may require some USB-C to USB-A cables depending on your USB hub. It's worth noting that the HTC Face Tracker requires a USB-C port, so you may need to consider this requirement as well. You may also make custom cables using USB breakout boards of your choice and wire.

Wires and adapters

If you are soldering and using custom cables:

IR emitters

We recommend getting the official kits:

Alternatively you can source the parts for V3 yourself:

DANGER

If they look like something you would find in a TV remote, do NOT use them. Even if you think you can alter them, THEY WILL NOT WORK AND WILL PUT YOU AT RISK. If you aren't exactly sure what you are doing, buy them from the LCSC or Digikey link.

DO NOT BUY FOCUSED ONES!

  • 4x Unfocused SMD IR emitters

    NOTE

    The smaller ones cannot be soldered at temperatures above 245°C or they will burn. Low temp solder is recommended.

  • 4x IR emitter PCBs (highly recommended) Gerber files and schematics located here.

  • 4x 350ohm 1206 SMD resistors for IR emitters (If you are not using PCBs for the emitters then buying regular through-hole resistors is acceptable)

  • 357 ohm Digikey here

  • 348 ohm LCSC here

This ensures you get the correct IR emitter parts. If you are a Patreon supporter, please check out the discount codes available to you, and also check out unit pricing.

3D Printed Mounts

Mounts with a ★ next to them are the recommended mounts for the respective HMD. This is based on user feedback on what works best for most people.

NOTE

You'll need both - a mount for the IR emitters and a mount for the camera boards, some files include both, some don't.

Misc parts/tools

AliExpress (Generic)
Amazon (Generic)

  • 2x-4x Heatsinks (Optional) - for the ESP32-CAM AliExpress (14mm for ESP, 8mm for voltage regulator)

Other Headsets

If you own another headset not listed above, that means there are no mounts designed for it yet. If you have basic skills in modeling or can think up a solution to mount cams and emitters, please try to make a mount and then let us know in the Discord so it can be added here. Any headset that can fit the camera is potentially compatible. If you are willing, give it a shot and design a mount for the rest of the community.

Check out our basic guide on making your own mount here.

TIP

If you have a headset mount that is not listed above, please let us know in the discord so it can be added here.

Released under the MIT License.

- + \ No newline at end of file diff --git a/how_to_build/preparing_cameras.html b/how_to_build/preparing_cameras.html index 64806f7..5d41bf3 100644 --- a/how_to_build/preparing_cameras.html +++ b/how_to_build/preparing_cameras.html @@ -19,7 +19,7 @@
Skip to content

IR Filter Removal

Some cameras require removal of the IR filter so the sensors are able to pick up the lighting provided by the IR LEDs. Many times, the IR filter will still be on the 160° night vision model. If your feed looks dark, this may be why!

130° IR Filter removal

160° IR Filter removal

Blurry Camera?

If your feed looks blurry in the ETVR app, you likely need to focus the cameras by turning the lens. They are glued in place from the factory, but the glue is easily broken with pliers.

Use a point of reference like the LEDs while holding the camera: rotate the lens with pliers in either direction and watch whether the feed gets more or less blurry. Repeat until you dial it in nicely. You should do this prior to mounting the cameras, or at least avoid mounting and unmounting them too many times, or you risk breaking them.

Protecting a Camera Ribbon Cable

The ribbon cables that these cameras use are notoriously prone to damage, rendering them useless.

By wrapping them in tape such as electrical tape and following best practices covered in the guide, you can significantly reduce the risk of killing a camera.

It is recommended to do this before building your setup so you are less likely to kill a camera in the process of building a setup.

Wrapping the Camera Ribbon

First get a roll of electrical tape to wrap the camera ribbon. It does not strictly need to be electrical tape, but that is what I have found to work well (and look good too).

Place the camera on the tape so that the entire bottom area, including the sensor, is covered and so that one side of the camera ribbon has slightly more tape (this helps make it look good).

Cut the tape from the roll, here I used flat cutters. Be careful to not cut the camera connector in the process.

With the tape cut from the roll, lay it down and get out an X-ACTO knife.

Begin to cut around the camera connector so the tape can be peeled off.

When each side has been cut, begin to peel off the part that covered the connections.

Now, carefully cut around the camera sensor part to remove its "skirt" leaving tape on the back of it.

Gently pull off this outline of tape from the camera.

Begin to wrap the tape along the ribbon cable by first folding in the slightly shorter side.

Fold over the other side.

And you are done!

Best Practices When Handling Cameras

  • Do not pull or jerk on the ribbon
  • Do not fold tightly or bend the ribbon cable sharper than ~45 degrees if possible
  • Do not expose to sharp objects or crevices
  • The less wear while handling or putting on/off a headset the better
  • Do not unmount and mount the cameras repetitively.

Conclusion

You have successfully wrapped your camera to be protected and learned the best practices with handling cameras!

Released under the MIT License.

- + \ No newline at end of file diff --git a/how_to_build/preparing_xiao.html b/how_to_build/preparing_xiao.html index d0c0868..de4779b 100644 --- a/how_to_build/preparing_xiao.html +++ b/how_to_build/preparing_xiao.html @@ -19,7 +19,7 @@
Skip to content

How to prepare a Seeed Studio XIAO Sense ESP32

What's in the box

Your XIAO should come with the following components:

  • ESP32-S3 Main Board
  • Camera Hat With Low FOV Camera
  • External Antenna
'img of xiao esp'

Seeed Studio XIAO Sense ESP32 Contents

Wired vs Wireless

The only difference in hardware configuration between wireless and wired XIAOs is whether the antenna is connected.

Wireless Configuration

If you are making a wireless setup, you will need to connect the antenna.

Line up the antenna connector with the one on the board,

'img of xiao esp'

then press firmly until it snaps on.

'img of xiao esp'

Once connected it should be flat like this:

'img of xiao esp'

Wired Configuration

If you are making a wired setup, you do not need to attach the antenna. Set it aside and continue to the next step.

Connecting the Camera Hat

First, locate the connector on the camera hat, and the corresponding one on the XIAO main board circled below.

'img of xiao esp'

Press down carefully until it snaps on.

'img of xiao esp'
'img of xiao esp'

Once connected, remember to be careful; these connectors can break fairly easily, so avoid disconnecting them too many times or twisting them in odd ways.

Removing the Camera

First we need to remove the existing camera from the XIAO and replace it with a higher-FOV, longer-ribbon camera.

Begin by gently lifting up the grey part of the camera connector until it flips up.

'img of xiao esp'
'img of xiao esp'

Now grab the camera and gently wiggle it out of the connector.

'img of xiao esp'
'img of xiao esp'

Connecting the camera

The camera should have the pins facing downward; you should only see the black end.

Line up the camera with the pins on the connector like below.

'img of xiao esp'

Push the camera in by using a finger on each side of the connector, slowly pushing straight in until it stops. Be gentle so you do not damage the ribbon cable.

'img of xiao esp'

The cable should go about halfway in, as in this image:

'img of xiao esp'

Now, close the camera connector by flipping the grey part down.

'img of xiao esp'
'img of xiao esp'

Conclusion

You should now have your XIAO Sense ESP32 ready for firmware flashing!

Released under the MIT License.

- + \ No newline at end of file diff --git a/how_to_build/v4_full_build.html b/how_to_build/v4_full_build.html index b8e1b08..a5e393f 100644 --- a/how_to_build/v4_full_build.html +++ b/how_to_build/v4_full_build.html @@ -24,7 +24,7 @@ 1 × USB hub 1 × Set of 3D printed mounts

Prepare your hardware

Follow the docs pages for preparing the LEDs, cameras and XIAO ESPs.

Flash the OpenIris firmware onto the ESPs.

Prepare and assemble the 3d printed mounts

Clip the LEDs into the 3d printed mounts, routing the wires up to the top of the headset.

Attach the cameras to the mount carefully. If you are using 130 degree cameras, you might need to hot-glue them depending on the mount. Glue the plastic housing, not the lens or the back of the camera (it will melt).

Now, carefully attach the mounts to your lens and route the camera cables down and out, and the LED wires up and through the top of the headset gasket.

Now, attach the ESPs to your headset by hot-gluing a mount in place, or with whatever other mounting method your mounts use.

Attach your headset gasket carefully now.

Wiring it up

Now, connect your USB hub to your headset and connect the ESPs and LED board with USB-C cables.

Attach any ESP antennas now to your headset.

Cable manage!

Software

At this point you should have all of your hardware and firmware ready, so it is time to set up the tracking app.

Wired

On Windows, you will need to open Device Manager and find the COM port (you may have this from the firmware flashing step). Under Serial Devices (COM) there will be a list of devices. You can unplug an ESP and see which one disconnects; that will be the port of that ESP. Enter this into the app under the corresponding eye (right or left), e.g. COM4.

Wireless

When flashing the firmware you will have set an mDNS address like ETVR-left.local. Enter this into the app under its corresponding eye.
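
If the app can't find a wireless tracker, it can help to first check that the name resolves at all. Here is a minimal sketch of such a check in Python (an optional extra, not something the app requires), assuming your OS resolves .local mDNS names (Windows 10+, or a system with Bonjour/Avahi):

    # Quick check that an ETVR tracker's mDNS name resolves.
    import socket

    name = "ETVR-left.local"  # the mDNS address you set while flashing

    try:
        print(name, "resolves to", socket.gethostbyname(name))
    except socket.gaierror:
        print(name, "did not resolve - check Wi-Fi and the firmware's mDNS name")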

- + \ No newline at end of file diff --git a/index.html b/index.html index 58ffc16..66363de 100644 --- a/index.html +++ b/index.html @@ -19,7 +19,7 @@
Skip to content

EyeTrackVR Docs

Source First and affordable VR eye tracking.

For Social VR Games via OSC and UDP protocol.

ETVR logo

Released under the MIT License.

- + \ No newline at end of file diff --git a/misc/faq.html b/misc/faq.html index 45b1e36..0d09d54 100644 --- a/misc/faq.html +++ b/misc/faq.html @@ -12,14 +12,14 @@ - + -
Skip to content

Frequently Asked Questions



Released under the MIT License.

- +
Skip to content

Frequently Asked Questions



Released under the MIT License.

+ \ No newline at end of file diff --git a/misc/jlc3dp.html b/misc/jlc3dp.html index 85a539f..e68c907 100644 --- a/misc/jlc3dp.html +++ b/misc/jlc3dp.html @@ -18,7 +18,7 @@
Skip to content

3D Printing Service JLC3DP

If you don't own a 3D printer, you can order high-quality prints by the dollar using JLC3DP

https://jlc3dp.com/3d-printing-quote

Upload your STL files with the following print settings

Do note that if you are printing, for example, a lens mount, you may need to mirror it in Blender or any slicer to get both a left and a right STL, and upload them separately. Do not upload them combined.

UploadJLC

Choose a cheaper shipping option; for me this is Global Standard Direct Line

Shipping

In the likely event you get an email or alert from JLC asking for consent to print as the STL has thin walls, you can agree to the print without worry.

Released under the MIT License.

- + \ No newline at end of file diff --git a/misc/vrc_avatar_setup.html b/misc/vrc_avatar_setup.html index ccba42d..03f7346 100644 --- a/misc/vrc_avatar_setup.html +++ b/misc/vrc_avatar_setup.html @@ -19,7 +19,7 @@
Skip to content

How to set up your avatar for eye tracking:

Here is an avatar setup video created by a contributor:

Another good resource is the VRC Face Tracking Wiki and their Discord.

We currently use the following parameters:

LeftEyeX

RightEyeX

EyesY

LeftEyeLidExpandedSqueeze

RightEyeLidExpandedSqueeze

NOTE

We currently only support float parameters; please adjust your setup accordingly.

Released under the MIT License.

- + \ No newline at end of file diff --git a/software_guide/VRCFT_tracking_module.html b/software_guide/VRCFT_tracking_module.html index cb5331a..c56d3e0 100644 --- a/software_guide/VRCFT_tracking_module.html +++ b/software_guide/VRCFT_tracking_module.html @@ -21,7 +21,7 @@
Skip to content

VRCFT ETVR Tracking module, or simply the ETVR tracking module

What is this?

VRCFT ETVR Tracking module is an extension for VRCFT. It allows ETVR to send out tracking data and have it converted to parameters expected by VRCFT and avatars compatible with VRCFT.

In effect, ETVR will be compatible with whichever game or program that supports VRCFT, thanks to this module.

It is required for all avatars that are set up to use Unified Expressions (UE) or V2 parameters, and optional for ones set up with so-called legacy or V1 parameters.

What do I need and how do I set this up?

To make ETVR work with VRCFT, you'll need two things: VRCFT itself and the tracking module.

Required programs / Files

VRCFT

To get VRCFT, head over to their docs page!

ETVR Module

Once you've installed VRCFT, you can find the most recent version of the module in the registry.

Screenshot showcasing the VRCFT module registry, with ETVR Tracking Module selected

Clicking on install will download and load the module; VRCFT will now be set up to listen for ETVR data. That's it!

Setting up ETVR to use the module

Screenshot showcasing the settings page of ETVR, explaining how to setup the VRCFT module
  • Open settings tab
  • Under OSC section select Use the ETVR Module
  • Select the desired parameters - V1 or V2 (UE)
  • Go back into the tracking tab

Settings will apply automatically, no need to change port or restart the app. ETVR will automatically start sending the data to the module.

Sending the tracking data to VRCFT running on a separate PC.

Screenshot showcasing the settings page of ETVR, explaining how to setup ETVR to talk to a different PC

If you're running a hybrid setup where one machine does all of the ETVR tracking, and the other has VRCFT and other software running, you'll need to adjust the settings a bit.

First, make sure that both machines are on the same network and they can see each other. If they're connected via Ethernet to the same router, or to the same WiFi network, things should just work.

Next, find out what the IP address of the machine running VRCFT is. For how to do this, refer to Microsoft's documentation

With that, head over to ETVR and select the VRCFT Module Settings tab. Notice the VRCFT Module listening IP section. By default it's set to 127.0.0.1; change it to the IP of the PC that runs VRCFT.

Resetting the module to default settings in case something went wrong.

Screenshot showcasing the settings page of ETVR, explaining how to reset the ETVR Module

This is also taken care of by the app; simply make sure that VRCFT is running and the ETVR module got loaded in.

Then, in ETVR head back over to VRCFT Module Settings tab and hit Reset Settings to default.

This will reset the module's settings to default values for both - ETVR and VRCFT.

There's a breaking change reported, what does that mean? What do I do?

When we say there's a breaking change, we mean that something changed so drastically that there's a high chance of the module failing to load. This is a very rare thing; it usually happens when settings get modified so much that they're no longer compatible.

In such cases, deleting the module's settings file is enough. Check the FAQ section - Where's the module's configuration file? - for locating and deleting it.

Dev version of the module for testing

Sometimes, we'll have a new version of the module ready for testing, or a special one with a lot of additional logging to help us debug some stuff on your end. This version won't be available in the registry right away, so how do you install it?

  1. Uninstall any pre-existing ETVR Modules:

Uninstall your current ETVR Module installation.

Similarly to installing it for the first time, navigate to VRCFT's module registry, find the ETVR Module and hit uninstall.

  2. Download the latest build:

Download the latest release / pre-release from the repo or from the discord.

  3. Install the development version:

Navigate to:

C:\Users\<Your PC name>\AppData\Roaming\VRCFaceTracking\CustomLibs

And place the module there. If CustomLibs doesn't exist yet, create it. VRCFT will use it to load the development version of the module next time you launch the app.

Building from source

Download the ETVR Module source from GitHub

First, just like you'd do with the app's source, download it from GitHub, either by downloading and extracting a zip or by using git clone.

Download the latest VRCFT Source

Download the VRCFT Source from their GitHub. It's important to put it next to the ETVR module. Your directory should look more or less like this:

parent_directory
     - ETVR Module
     - VRCFT Repo

Building the module requires some of the VRCFT stuff to be present, namely the Core directory. To make it easier to compile, we've set the project up so that it will look for it, next to its own directory.

Install Visual Studio or Jetbrains Rider, and open the project

Whichever IDE you prefer, both just work and both make it easier to work with the source code.

Once installed, open the project in them and wait for everything to load.

Building the module

That's the last step; now we just need to verify that everything is set up correctly. Click on build, or hit Ctrl+B, and watch the output.

It should build just fine and the resulting file should get copied to the VRCFT CustomModules directory.

Note, the copying will fail when VRCFT is running. If it does, close VRCFT and hit build again; this time only the copy step will be executed.

FAQ

Where's the module's configuration file?

In the rare cases where you need to modify the configuration file, here's how to find it:

Since VRCFT is a UWP app and those behave quite a bit differently than regular apps, there's no easy way to access that file. Instead, you'll need to search your C:/ drive for ETVRModuleConfig.json and it should show up.

What do the fields in this file mean?

Range explanation: 0 - fully closed, 1 - fully open.

  • OutputMultiplier: Defines by how much the output should be multiplied, it helps with making your tracking less or more expressive. 1 by default, [0 - 2] range.

  • PortNumber: the port under which the module will be listening for OSC messages, by default 8889

  • ShouldEmulateEyeWiden: Toggle for emulating eye widen, think surprised face. On by default, will activate at 0.97 of a fully open eye

  • ShouldEmulateEyeSquint: Toggle for emulating squinting, think shutting your eyes with a bit of force. On by default, will activate at 0.05 of eye openness.

  • ShouldEmulateEyebrows: Toggle for emulating eyebrow movement; depending on eye openness, the module will try and move the avatar's eyebrows a little. On by default at 0.89, range [0 - 1]

  • EyebrowThresholdRising: Defines when eyebrow emulation should kick in, by default at 0.89, range [0 - 1]

  • EyebrowThresholdLowering: Defines when eyebrow emulation should kick in but in the opposite direction, by default at 0.06, range [0 - 1]

  • SqueezeThresholdV1 - Defines when squeeze emulation should begin, and how "fast" should it be. By default, [0.06, 0.51], with range first: [0 - 1], second: [0 - 2]. This is used for v1 or legacy parameters

  • SqueezeThresholdV2 - Defines when squeeze emulation should begin, and how "fast" should it be. By default, [0.06, -0.99], with range first: [0 - 1], second: [-2 - 0]. This is used for v2 or Unified Expressions parameters

  • WidenThresholdV1 - Defines when Widen emulation should begin, and how "fast" should it be. By default, [0.93, 1.01], with range first: [0 - 1], second: [0 - 2]. This is used for v1 or legacy parameters

  • WidenThresholdV2 - Defines when widen emulation should begin, and how "fast" should it be. By default, [0.96, 1.04], with range first: [0 - 1], second: [0 - 2]. This is used for v2 or Unified Expressions parameters
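
If you ever need to tweak one of these fields by hand, here is a minimal sketch using Python's standard json module (the field names come from the list above; the file path is a placeholder, so use the location your search from the previous question turned up):

    import json
    from pathlib import Path

    # Placeholder path - use the actual location of ETVRModuleConfig.json on your drive.
    config_path = Path(r"C:\path\to\ETVRModuleConfig.json")

    config = json.loads(config_path.read_text())
    config["OutputMultiplier"] = 1.2  # e.g. make tracking slightly more expressive
    config_path.write_text(json.dumps(config, indent=2))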

To better explain how the Widen / Squeeze emulation thresholds work: they're implemented via a simple smoothstep. You can play around with it here - simply plug the values into smoothstep() and watch how the graph reacts.
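
For a rough illustration of the idea (a generic smoothstep sketch, not the module's actual source), this is how a threshold pair like the default WidenThresholdV2 [0.96, 1.04] maps eye openness to an emulated widen amount:

    def smoothstep(edge0: float, edge1: float, x: float) -> float:
        """Standard smoothstep: 0 below edge0, 1 above edge1, smooth in between."""
        t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
        return t * t * (3.0 - 2.0 * t)

    # Default WidenThresholdV2 values from the list above.
    for openness in (0.90, 0.96, 0.98, 1.00):
        print(openness, round(smoothstep(0.96, 1.04, openness), 3))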

Released under the MIT License.

- + \ No newline at end of file diff --git a/software_guide/build_software.html b/software_guide/build_software.html index 7cb1064..25b647c 100644 --- a/software_guide/build_software.html +++ b/software_guide/build_software.html @@ -20,7 +20,7 @@
Skip to content

Build the app from source

This guide will show how to build the app from source

NOTE

This is NOT a required step, you do not need to build the app from source.

Requirements

Install Python

EyeTrackVR currently uses Python 3.11.0. Before you continue, please install it.

Installing Poetry

Starting with version 0.1.7, EyeTrackVR uses Poetry to manage app dependencies. To build the app, you must first install Poetry to fetch the required dependencies.

To install Poetry, open Windows PowerShell and run the following command: (Invoke-WebRequest -Uri https://install.python-poetry.org -UseBasicParsing).Content | py -

Poetry Documentation

Install the required Python modules

After cloning the project and installing Poetry, open a command prompt in the EyeTrackApp folder. Then run the command: poetry install

This should install all of the required modules.

Build the app

Now, you should be ready to build the app. With a command prompt open in the EyeTrackApp folder, run the command poetry run pyinstaller eyetrackapp.spec

Give it time to build the app. Once done, the app should be under dist/eyetrackapp

Released under the MIT License.

- + \ No newline at end of file diff --git a/software_guide/eyetrackvr_app_guide.html b/software_guide/eyetrackvr_app_guide.html index 4238e4b..5f03ff2 100644 --- a/software_guide/eyetrackvr_app_guide.html +++ b/software_guide/eyetrackvr_app_guide.html @@ -20,7 +20,7 @@
Skip to content

How to install, run and adjust the EyeTrackVR app.

Step 1: Download the EyeTrackVR Installer and install the EyeTrackVR app

Go to the latest GitHub release here and download the Setup.exe file.

Follow the prompts and the app should be installed.

Step 2: Run the EyeTrackVR app

If the Create Desktop Shortcut option was checked you should see an icon on your desktop, double click it to run.

You then should be greeted with a GUI that looks like this:

'img'

Step 3: Getting familiar with settings and terminology

Let's go over some basic terminology you will find in the app.

Starting from the top:

Right eye

Shows the right eye feed and settings only.

Left eye

Shows the left eye feed and settings only.

Both eyes

Shows both eyes' feed and settings.

Camera Address

This is where you enter the IP address of your camera. Alternatively, it can be used to enter the camera number for wired cameras or to pass in a video file.

Tracking Mode

This changes the GUI to the tracking mode where it outputs values.

Cropping Mode

This is where you will crop out your eye.

Threshold

This is used to filter out everything that isn't dark like your pupil.
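
To illustrate what the slider is doing conceptually, here is a minimal binary-threshold sketch using OpenCV (an illustration only, not the app's actual code; the file names are hypothetical):

    # Pixels darker than the cutoff (like the pupil) become white in the output mask.
    import cv2

    gray = cv2.imread("eye_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical saved frame
    cutoff = 60  # roughly analogous to the slider position
    _, mask = cv2.threshold(gray, cutoff, 255, cv2.THRESH_BINARY_INV)
    cv2.imwrite("threshold_view.png", mask)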

Rotation

For our method to work best, you want your eye to be level. Use this slider to adjust the image until that is the case.

Restart Calibration

This will start a calibration mode for your eye where you look to all extremes.

Recenter Eye

This will recenter your eye to whatever point you are looking at.

Step 4: Adding your cameras to the software and configuring them

Wireless configuration

Power your ESPs and find the IP address for your right eye. This can be done by opening both cameras in a browser and then holding your finger over your right eye camera.

Copy that IP address and then close the browser tab with it open.

Enter that IP address into the app's Camera Address field.

Press the Save and Restart Tracking button.

Wired configuration (experimental)

Note this is a Beta app feature only. Please use the latest Open Beta in the Discord.

Find your ESPs in Device Manager and figure out their COM port numbers, e.g. COM4.

Enter that COM port number into the app's Camera Address field.

Press the Save and Restart Tracking button.
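
If you are unsure which COM port belongs to which tracker, a small script can list serial devices as an alternative to Device Manager. Here is a minimal sketch using the pyserial package (pip install pyserial - an optional helper, not something the app requires):

    # Lists serial ports; run it twice, unplugging one ESP in between,
    # to see which COM port disappears.
    from serial.tools import list_ports

    for port in list_ports.comports():
        print(port.device, "-", port.description)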

Setting up ROI

Don't see your camera feed? That's because an ROI hasn't been set yet.

'img'

See the Awaiting Eye Cropping Settings text?

Now press the Cropping Mode button. You should see a feed of your camera.

Put your headset on and use an application to see your desktop. (Virtual desktop, SteamVR desktop, etc.)

You should see something like this:

'img'

:O It's my eye!

Now, draw a rectangle that selects your eye.

A good example of an ROI

'img'

Head back over to the Tracking mode.

We will now adjust our rotation by moving the Rotation slider.

From this:

'img'

It's crooked!

To this:

'img'

Much better!

Now we will adjust our threshold.

With your headset still on, move the slider all the way up, then slowly back it off until mostly just your pupil is visible in the threshold viewer.

Example of a threshold being too low:

'img'

Too low!

Example of too high of a threshold:

'img'

Too high!

Example of a good threshold:

'img'

Much better!

Repeat everything in this step for your left eye.

Step 5: Calibrating your eyes

Once your eye is trackable by the software, we need to calibrate it.

Press the Restart Calibration button and look around. The important part is that you look to all extremes: all the way up, down, left, and right. Once you have done so, wait for the Mode to say Tracking.

The next step is to center your eye.

Look straight forward and press the Recenter Eye button.

Your eye is now fully calibrated.

Repeat for your other eye.

NOTE

To best center both eyes, look at one spot and hit the 'Recenter Eye' button for each eye without moving your eyes.

Released under the MIT License.

diff --git a/software_guide/osc_setup.html b/software_guide/osc_setup.html
index 4de57ba..cae188d 100644
--- a/software_guide/osc_setup.html
+++ b/software_guide/osc_setup.html
@@ -20,7 +20,7 @@

Setting up recalibration and recentering from VRChat.

ETVR has a mechanism that lets you recalibrate and recenter your tracking without interacting with the app directly. Here's how to set up your avatar to use OSC to trigger this from within VRC itself.

NOTE

While this page explains the process from the point of view of VRChat, it can be done in Resonite and CVR too; all the app listens for are two OSC messages.

Setting up your parameters

Adding parameters

You'll need to add two parameters to your Parameters config, those being:

  • etvr_recalibrate: Bool
  • etvr_recenter: Bool

Neither of them has to be synced, meaning they won't increase your total parameter memory count.

To do this, in Unity, select your avatar in the scene and find the Avatar Descriptor in the inspector. In it, find the section named "expressions", expand it, and click on the file assigned to the "parameters" field; that's your Parameters config.

Screenshot showing expressions section of vrc avatar descriptor

Clicking on it will highlight the associated asset in the asset browser. Click on the highlighted file to show its contents in the inspector panel. Now, in the inspector, click on the Add button to add the parameters mentioned above. Both of them need to be of type Bool, but they don't need to be synced or saved.

Screenshot showing adding parameters to parameter menu
Screenshot showing the params that need to be added

That's it for the parameters.

Adding buttons to the menu

Now, you will need to add actual buttons for them in your gesture menu.

The expressions section mentioned above also contains the menu. You can use that to either add the buttons there directly or create a small submenu to keep things less cluttered; here we will add them directly.

Screenshot showing expressions section of vrc avatar descriptor

Once you have the menu file opened in the inspector, click on the "add control" button. This will add an empty action slot; expand it.

Screenshot showing the empty controls field

You'll see a bunch of fields: name, icon, type, and parameter. First, give it a friendly name like "recalibrate eyes" or "recenter tracking", depending on the action you want to assign to it.

Leave the type as is (a button); it's more convenient because it bounces back into the off state by itself.

Now, in the parameter section, click on the [None] field and select one of the parameters you created in the previous section.

Screenshot showing the controls setup

Repeat the steps for the other one.

Once done, simply build and upload the updated version of your avatar.

And that's it! You should now be able to recenter and recalibrate your tracking from inside VRC.

Resetting the OSC config in case the parameters don't work

Sometimes VRChat won't regenerate the config files that it uses to communicate the changes to anything that's listening. To fix this, simply delete the auto-generated configuration files and let VRC remake them.

They are stored under

C:\Users\<user>\AppData\LocalLow\VRChat\VRChat\OSC\<your user uuid>\Avatars\

Deleting the contents of the Avatars directory will resolve the issue.
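If you prefer to script this cleanup, here is a minimal Python sketch (Windows-only, assuming the default AppData location; close VRChat before running it):

  # Minimal sketch: deletes VRChat's auto-generated OSC avatar configs so that
  # VRChat regenerates them on the next launch. Close VRChat before running.
  import os
  import shutil
  from pathlib import Path

  osc_root = Path(os.environ["LOCALAPPDATA"]).parent / "LocalLow" / "VRChat" / "VRChat" / "OSC"

  for avatars_dir in osc_root.glob("*/Avatars"):
      print("Clearing", avatars_dir)
      shutil.rmtree(avatars_dir)        # remove the cached avatar configs
      avatars_dir.mkdir()               # leave an empty Avatars folder behind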

Other platforms

Like mentioned in the note, this functionality is not limited to VRC.

The app itself listens to OSC messages sent on port 9001 to two addresses:

  • /avatar/parameters/etvr_recenter
  • /avatar/parameters/etvr_recalibrate

It only needs to receive a Bool value of True to trigger the recentering or recalibration.

The port and the addresses can be changed in the app settings, if need be.
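For other platforms, or for testing without a game running, here is a minimal Python sketch using the third-party python-osc package (pip install python-osc); the package is just one example, any OSC-capable tool works:

  # Minimal sketch: sends the OSC messages ETVR listens for on its default port.
  from pythonosc.udp_client import SimpleUDPClient

  client = SimpleUDPClient("127.0.0.1", 9001)  # host running the ETVR app, default port

  # Sending a Bool value of True triggers the corresponding action.
  client.send_message("/avatar/parameters/etvr_recenter", True)
  # client.send_message("/avatar/parameters/etvr_recalibrate", True)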

Released under the MIT License.
