https://github.com/astrazeneca/multimodal-python-course
The purpose of the code is to facilitate a comprehensive understanding of multimodal data science applications within the medical domain. The code supports the delivery of a workshop designed to introduce researchers to the rapidly evolving field of multimodal data science.
Science Score: 49.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file (found)
- ✓ .zenodo.json file (found)
- ✓ DOI references (found 1 DOI reference in README)
- ✓ Academic publication links (arxiv.org, medrxiv.org, ncbi.nlm.nih.gov, nature.com)
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 7.2%, to scientific vocabulary)
Repository
Basic Info
- Host: GitHub
- Owner: AstraZeneca
- License: apache-2.0
- Language: Jupyter Notebook
- Default Branch: main
- Size: 11.2 MB
Statistics
- Stars: 9
- Watchers: 2
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md

Navigating the Multimodal Map: Insights into Foundation Models
Venue

Online training course run by the NextGen Data Scientists, AstraZeneca
Trainers
Sylwia Majchrowska, Ricardo Mokhtari
Course structure and links
Day | Title | Activity | Materials
:---:|:-----:|:--------:|:---------:
0 | Troubleshooting software installations | Preparation | Introduction and installations
1 | SAM Concept Cove | Session | Materials
2 | Multimodal data handling | Session | Materials
References
- LangSAM (code)
- Grounding DINO (code, paper)
- SAM (code, paper)
- Attention illustrated blog
- Attention video
- Another attention video
- Cross attention
- Visualise a transformer
- The RSNA-ASNR-MICCAI BraTS 2021 Benchmark on Brain Tumor Segmentation and Radiogenomic Classification
- MMML Tutorial - ICML 2023
- Multimodal data fusion – analysis
- Fusion of Multi-Modal Data Stream for Clinical Event Prediction - Imon Banerjee, PhD
- Data-Efficient Multimodal Fusion on a Single GPU
- Integrated multimodal artificial intelligence framework for healthcare applications
- Inferring multimodal latent topics from electronic health records
- Multimodal Risk Prediction with Physiological Signals, Medical Images and Clinical Notes
Owner
- Name: AstraZeneca
- Login: AstraZeneca
- Kind: organization
- Location: Global
- Website: https://www.astrazeneca.com/
- Repositories: 33
- Profile: https://github.com/AstraZeneca
Data and AI: Unlocking new science insights
GitHub Events
Total
- Watch event: 3
- Push event: 1
Last Year
- Watch event: 3
- Push event: 1