emotiphai_public

A platform for group physiological data collection and retrospective emotion annotation.

https://github.com/patriciabota/emotiphai_public

Science Score: 67.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 2 DOI reference(s) in README
  • Academic publication links
    Links to: springer.com
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (13.7%) to scientific vocabulary
Last synced: 6 months ago

Repository

A platform for group physiological data collection and retrospective emotion annotation.

Basic Info
  • Host: GitHub
  • Owner: PatriciaBota
  • License: agpl-3.0
  • Language: HTML
  • Default Branch: main
  • Size: 49.2 MB
Statistics
  • Stars: 0
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Created over 1 year ago · Last pushed 11 months ago
Metadata Files
Readme · License · Citation

README.md

EmotiphAI_public

EmotiphAI is a platform developed to address the challenge of collecting physiological data from groups, coordinated through a centralised controller.

Motivation

The platform is designed not only for real-time biosignal acquisition but also for retrospective emotion annotation. By analyzing Electrodermal Activity (EDA) data, EmotiphAI identifies significant moments in a session (e.g., during a 2-hour movie), allowing for targeted annotation. This approach minimizes distraction during the emotion elicitation process, making it more efficient and user-friendly.

Methods

EmotiphAI is built on a low-cost, standalone local infrastructure, which includes:

  • Hardware:

    • A local hub, such as a Raspberry Pi or Odroid, that serves as the central data receiver.
    • A wearable device, 3D-printed and based on the ESP32 microcontroller, for biosignal acquisition.
  • Communication:

    • Data is transmitted via Bluetooth to the local hub, which is connected through a WiFi router (e.g., TP-Link Wireless N 450Mbps (TL-WR940N)).
    • Multiprocessing is employed to manage simultaneous data reception from multiple devices while optimizing CPU core usage.
  • Software:

    • An end-user interface for real-time data visualization and emotion annotation.

For detailed methodology and technical specifications, refer to the scientific paper (doi: 10.1007/s00521-023-07762-7).
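The hub's multiprocessing design described above can be sketched as follows. The process structure, queue, and function names are assumptions for illustration, not the repository's actual code:

```python
# Illustrative sketch of the hub-side design (assumed, not the repo's code):
# one worker process per wearable drains that device's sample stream, so
# reception from multiple devices proceeds in parallel across CPU cores.
import multiprocessing as mp

def receive_device(device_id, n_samples, out_queue):
    """Stand-in for a per-device Bluetooth reader feeding the hub."""
    for i in range(n_samples):
        out_queue.put((device_id, i))  # real code would read from the radio
    out_queue.put((device_id, None))   # end-of-stream marker

def run_hub(n_devices=3, n_samples=5):
    queue = mp.Queue()
    workers = [
        mp.Process(target=receive_device, args=(d, n_samples, queue))
        for d in range(n_devices)
    ]
    for w in workers:
        w.start()
    finished, received = 0, []
    while finished < n_devices:
        device_id, sample = queue.get()
        if sample is None:
            finished += 1
        else:
            received.append((device_id, sample))
    for w in workers:
        w.join()
    return received

if __name__ == "__main__":
    samples = run_hub()
    print(len(samples))  # 3 devices x 5 samples each
```

A shared queue keeps ordering per device while letting the OS schedule readers on separate cores, which matches the stated goal of optimizing CPU usage during simultaneous reception.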

emotiphai_infrastructure

Results

The EmotiphAI platform can:

  • Collect data from up to 30 devices at 50 Hz (1 channel), or from 10 devices at 100 Hz (2 channels).
  • Support real-world data collection: the platform was used to gather a dataset comprising over 350 hours of data, which is publicly available here.
  • Scientific paper: doi 10.1007/s00521-023-07762-7.
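The two operating points above imply comparable aggregate sample rates at the hub; a quick arithmetic check (the helper name is illustrative, the numbers are those quoted above):

```python
# Aggregate throughput implied by the two configurations quoted above.
def aggregate_rate(devices, hz, channels):
    """Total samples per second arriving at the hub."""
    return devices * hz * channels

config_a = aggregate_rate(30, 50, 1)   # 30 devices, 50 Hz, 1 channel
config_b = aggregate_rate(10, 100, 2)  # 10 devices, 100 Hz, 2 channels
print(config_a, config_b)  # -> 1500 2000 samples/s
```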

DEMOs

Acquisition · Annotation

Installation

Installation can be easily done with the Clone or Download button above:

```bash
$ git clone https://github.com/PatriciaBota/EmotiphAI.git
```

Configuration

  • Configurations can be found at fastapi/src/core/config.py
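The path above points at a central settings module; a hypothetical sketch of what such a `config.py` might contain (every name and default below is an assumption for illustration, not the repository's actual settings):

```python
# Hypothetical sketch of a central settings module along the lines of
# fastapi/src/core/config.py (names/defaults are illustrative, not the repo's).
from dataclasses import dataclass

@dataclass(frozen=True)
class Settings:
    host: str = "0.0.0.0"        # address the FastAPI app binds to
    port: int = 8000             # uvicorn port
    sampling_rate_hz: int = 50   # per-device biosignal rate
    max_devices: int = 30        # upper bound quoted in the Results section

settings = Settings()
print(settings.host, settings.port)
```

A frozen dataclass gives one importable, read-only source of truth for the app; consult the actual file for the real options.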

Run

  1. make create-venv
  2. make install
  3. make run

To get started with EmotiphAI:

  1. Set up the local infrastructure with the required hardware and software.
  2. Deploy the wearable devices to participants.
  3. Use the platform's interface to monitor and annotate data in real-time or retrospectively.

Acknowledgments

This work was funded by FCT - Fundação para a Ciência e a Tecnologia under grants 2020.06675.BD and FCT (PCIF/SSO/0163/2019 SafeFire), FCT/MCTES national funds, co-funded EU (UIDB/50008/2020 NICE-HOME), Xinhua Net FMCI (S-0003-LX-18), Ministry of Economy and Competitiveness of the Spanish Government co-funded by ERDF (TIN2017-85409-P PhysComp), and IT - Instituto de Telecomunicações, by the European Regional Development Fund (FEDER) through the Operational Competitiveness and Internationalization Programme (COMPETE 2020), and by National Funds (OE) through the FCT under the LISBOA-01-0247-FEDER-069918 “CardioLeather” and LISBOA-1-0247-FEDER-113480 “EpilFootSense”.

Owner

  • Login: PatriciaBota
  • Kind: user

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this code, please cite the following paper."
authors:
  - family-names: Bota
    given-names: Patrícia
  - family-names: Flety
    given-names: Emmanuel
  - family-names: Silva
    given-names: Hugo Plácido da
  - family-names: Fred
    given-names: Ana
title: "EmotiphAI: a biocybernetic engine for real-time biosignals acquisition in a collective setting"
journal: "Neural Computing and Applications"
volume: "35"
number: "8"
pages: "5721-5736"
year: 2023
publisher: "Springer"
doi: "10.1007/s00521-023-07762-7"
url: "https://doi.org/10.1007/s00521-023-07762-7"


Dependencies

requirements.txt pypi
  • Jinja2 ==3.1.2
  • MarkupSafe ==2.1.3
  • Pillow ==10.1.0
  • PyYAML ==6.0.1
  • SQLAlchemy ==2.0.9
  • aiofiles ==23.2.1
  • annotated-types ==0.6.0
  • anyio ==3.7.1
  • bidict ==0.22.1
  • biosppy ==1.0.0
  • certifi ==2023.7.22
  • charset-normalizer ==3.3.2
  • click ==8.1.7
  • contourpy ==1.2.0
  • cycler ==0.12.1
  • exceptiongroup ==1.1.3
  • fastapi ==0.104.1
  • fastapi-socketio ==0.0.10
  • ffmpeg ==1.4
  • fonttools ==4.44.3
  • future ==0.18.3
  • greenlet ==3.0.1
  • h11 ==0.14.0
  • h5py ==3.10.0
  • idna ==3.4
  • iso8601 ==2.1.0
  • itsdangerous ==2.1.2
  • joblib ==1.3.2
  • kiwisolver ==1.4.5
  • matplotlib ==3.8.1
  • netifaces ==0.11.0
  • numpy ==1.26.2
  • opencv-python ==4.7.0.72
  • packaging ==23.2
  • pandas ==2.1.3
  • platformdirs ==4.0.0
  • pydantic ==2.5.1
  • pydantic_core ==2.14.3
  • pyparsing ==3.1.1
  • python-dateutil ==2.8.2
  • python-engineio ==4.4.1
  • python-multipart ==0.0.6
  • python-socketio ==5.7.2
  • pytz ==2023.3.post1
  • requests ==2.31.0
  • requests-unixsocket ==0.2.0
  • scenedetect ==0.6.2
  • scikit-learn ==1.3.2
  • scipy ==1.11.3
  • serial ==0.0.97
  • shortuuid ==1.0.11
  • simple-websocket ==1.0.0
  • six ==1.16.0
  • sniffio ==1.3.0
  • socketIO-client ==0.7.2
  • starlette ==0.27.0
  • structlog ==23.2.0
  • threadpoolctl ==3.2.0
  • tqdm ==4.66.1
  • typing_extensions ==4.8.0
  • tzdata ==2023.3
  • urllib3 ==2.1.0
  • uvicorn ==0.24.0.post1
  • websocket-client ==1.5.1
  • websockets ==11.0.3
  • wsproto ==1.2.0
requirements_mac.txt pypi
  • Same pinned dependencies as requirements.txt above.