<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Awesome on Hanguangwu</title><link>https://hanguangwu.github.io/blog/en/categories/awesome/</link><description>Recent content in Awesome on Hanguangwu</description><generator>Hugo -- gohugo.io</generator><language>en</language><copyright>hanguangwu</copyright><lastBuildDate>Mon, 23 Mar 2026 13:34:25 -0800</lastBuildDate><atom:link href="https://hanguangwu.github.io/blog/en/categories/awesome/index.xml" rel="self" type="application/rss+xml"/><item><title>GitHub Repo Deep-Learning-Based-Image-Compression</title><link>https://hanguangwu.github.io/blog/en/p/github-repo-deep-learning-based-image-compression/</link><pubDate>Mon, 23 Mar 2026 13:34:25 -0800</pubDate><guid>https://hanguangwu.github.io/blog/en/p/github-repo-deep-learning-based-image-compression/</guid><description>&lt;h1 id="awesome-public-datasets"&gt;Deep-Learning-Based-Image-Compression
&lt;/h1&gt;&lt;h2 id="introduction"&gt;Introduction
&lt;/h2&gt;&lt;p&gt;&lt;a class="link" href="https://github.com/ppingzhang/Deep-Learning-Based-Image-Compression" target="_blank" rel="noopener"
&gt;A curated paper list on deep-learning-based image compression&lt;/a&gt;&lt;/p&gt;
&lt;h2 id="paper-list"&gt;Paper List
&lt;/h2&gt;&lt;h3 id="generative-compression"&gt;Generative compression
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.ecva.net/papers/eccv_2024/papers_ECCV/papers/07338.pdf" target="_blank" rel="noopener"
&gt;Rate-Distortion-Cognition Controllable Versatile Neural Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ECCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.ecva.net/papers/eccv_2024/papers_ECCV/papers/07844.pdf" target="_blank" rel="noopener"
&gt;Lossy Image Compression with Foundation Diffusion Models&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ECCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.ecva.net/papers/eccv_2024/papers_ECCV/papers/05155.pdf" target="_blank" rel="noopener"
&gt;EGIC: Enhanced Low-Bit-Rate Generative Image Compression Guided by Semantic Segmentation&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ECCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2507.04947" target="_blank" rel="noopener"
&gt;DC-AR: Efficient Masked Autoregressive Image Generation with Deep Compression Hybrid Tokenizer&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2506.21977" target="_blank" rel="noopener"
&gt;StableCodec: Taming One-Step Diffusion for Extreme Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://iccv.thecvf.com/virtual/2025/poster/577" target="_blank" rel="noopener"
&gt;DLF: Extreme Image Compression with Dual-generative Latent Fusion&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://iccv.thecvf.com/virtual/2025/poster/2681" target="_blank" rel="noopener"
&gt;Cross-Granularity Online Optimization with Masked Compensated Information for Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2025/papers/Xu_Decouple_Distortion_from_Perception_Region_Adaptive_Diffusion_for_Extreme-low_Bitrate_CVPR_2025_paper.pdf" target="_blank" rel="noopener"
&gt;Decouple Distortion from Perception: Region Adaptive Diffusion for Extreme-low Bitrate Perception Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;CVPR 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/forum?id=xiVuqZZ59O" target="_blank" rel="noopener"
&gt;Ultra Lowrate Image Compression with Semantic Residual Coding and Compression-aware Diffusion&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICML 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/pdf?id=qi7udwV66M" target="_blank" rel="noopener"
&gt;Zero-Shot Image Compression with Diffusion-Based Posterior Sampling&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/forum?id=z0hUsPhwUN" target="_blank" rel="noopener"
&gt;Once-for-All: Controllable Generative Image Compression with Dynamic Granularity Adaptation&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICLR 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ojs.aaai.org/index.php/AAAI/article/view/33403" target="_blank" rel="noopener"
&gt;Conditional Latent Coding with Learnable Synthesized Reference for Deep Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;AAAI 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ojs.aaai.org/index.php/AAAI/article/view/33175" target="_blank" rel="noopener"
&gt;GLIC: General Format Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;AAAI 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2411.10185v3" target="_blank" rel="noopener"
&gt;Efficient Progressive Image Compression with Variance-aware Masking&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="None" &gt;UniMIC: Towards Universal Multi-modality Perceptual Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2412.10935v2" target="_blank" rel="noopener"
&gt;Progressive Compression with Universally Quantized Diffusion Models&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2412.11379v1" target="_blank" rel="noopener"
&gt;Controllable Distortion-Perception Tradeoff Through Latent Diffusion for Neural Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;AAAI 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2407.19651" target="_blank" rel="noopener"
&gt;ComNeck: Bridging Compressed Image Latents and Multimodal LLMs via Universal Transform-Neck&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2412.12982v1" target="_blank" rel="noopener"
&gt;Stable Diffusion is a Natural Cross-Modal Decoder for Layered AI-generated Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2407.12538v1" target="_blank" rel="noopener"
&gt;Linearly transformed color guide for low-bitrate diffusion based image compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2408.08459" target="_blank" rel="noopener"
&gt;JPEG-LM: LLMs as Image Generators with Canonical Codec Representations&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10566414&amp;amp;casa_token=4U0sgUNsxyQAAAAA:0ayUIqrQmKrwfM8v1sE67ZZaS48OiReJjRZdRqHyTlnCHI4zm_PSEqwM4QsvNI7qccQzSXg" target="_blank" rel="noopener"
&gt;Image Encryption and Compression Based on Reversed Diffusion Model&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;PCS 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2406.00758" target="_blank" rel="noopener"
&gt;Once-for-All: Controllable Generative Image Compression with Dynamic Granularity Adaption&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10570244&amp;amp;casa_token=xkZkXmlgP3wAAAAA:DYmBBrPQf2IwWoUAF70Te7XtdfSg85ud771PVI_vkfwCbjPUTB1cGuM3k_levF40o4NmV-s" target="_blank" rel="noopener"
&gt;Machine Perception-Driven Facial Image Compression: A Layered Generative Approach&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TCSVT 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2407.07723" target="_blank" rel="noopener"
&gt;Understanding is Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2407.17060" target="_blank" rel="noopener"
&gt;High Efficiency Image Compression for Large Visual-Language Models&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/pdf?id=nSUMQhITdd" target="_blank" rel="noopener"
&gt;Consistency Guided Diffusion Model with Neural Syntax for Perceptual Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ACM MM 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2407.09896" target="_blank" rel="noopener"
&gt;Zero-Shot Image Compression with Diffusion-Based Posterior Sampling&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2407.12538" target="_blank" rel="noopener"
&gt;High Frequency Matters: Uncertainty Guided Image Compression with Wavelet Diffusion&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2407.12295" target="_blank" rel="noopener"
&gt;Exploiting Inter-Image Similarity Prior for Low-Bitrate Remote Sensing Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2406.03961" target="_blank" rel="noopener"
&gt;LDM-RSIC: Exploring Distortion Prior with Latent Diffusion Models for Remote Sensing Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2024/papers/Jia_Generative_Latent_Coding_for_Ultra-Low_Bitrate_Image_Compression_CVPR_2024_paper.pdf" target="_blank" rel="noopener"
&gt;Generative Latent Coding for Ultra-Low Bitrate Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;CVPR 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/abs/2406.09356" target="_blank" rel="noopener"
&gt;CMC-Bench: Towards a New Paradigm of Visual Signal Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="lossless-compression"&gt;Lossless Compression
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2025/papers/Zhang_Fitted_Neural_Lossless_Image_Compression_CVPR_2025_paper.pdf" target="_blank" rel="noopener"
&gt;Fitted Neural Lossless Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;CVPR 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2411.12448" target="_blank" rel="noopener"
&gt;Large Language Models for Lossless Image Compression: Next-Pixel Prediction in Language Space is All You Need&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;NeurIPS 2026&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2509.07704" target="_blank" rel="noopener"
&gt;SEEC: Segmentation-Assisted Multi-Entropy Models for Learned Lossless Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/abs/2412.17464" target="_blank" rel="noopener"
&gt;CALLIC: Content Adaptive Learning for Lossless Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;AAAI 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2410.17814" target="_blank" rel="noopener"
&gt;Learning Lossless Compression for High Bit-Depth Volumetric Medical Image&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10647302&amp;amp;casa_token=xbYddfRSqMoAAAAA:19cLT7kxdjVYv0j84IsNlUYujos72wpW_2phbqj45fjq-mNwLktHwGzZwENu4faVl1nvkhA" target="_blank" rel="noopener"
&gt;Rate-Complexity Optimization in Lossless Neural-Based Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2412.00369v1" target="_blank" rel="noopener"
&gt;Random Cycle Coding: Lossless Compression of Cluster Assignments via Bits-Back Coding&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.sciencedirect.com/science/article/abs/pii/S0031320324003832" target="_blank" rel="noopener"
&gt;Hybrid-context-based multi-prior entropy modeling for learned lossless image compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;Pattern Recognition 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2024/papers/Zhang_Learned_Lossless_Image_Compression_based_on_Bit_Plane_Slicing_CVPR_2024_paper.pdf" target="_blank" rel="noopener"
&gt;Learned Lossless Image Compression based on Bit Plane Slicing&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;Pattern Recognition 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="variable-rate--scalable-compression"&gt;Variable Rate / Scalable Compression
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/pdf?id=1groaXTrKo" target="_blank" rel="noopener"
&gt;Towards Scalable Compression with Universally Quantized Diffusion Models&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;NeurIPSW 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://neurips.cc/virtual/2024/98246" target="_blank" rel="noopener"
&gt;Flexible image decoding in learned image compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;NeurIPSW 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/1907.07875v1" target="_blank" rel="noopener"
&gt;Variable-size Symmetry-based Graph Fourier Transforms for image compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2207.04324v2" target="_blank" rel="noopener"
&gt;Latent Variables Coding for Perceptual Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ACM MM 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2410.00557" target="_blank" rel="noopener"
&gt;STanH: Parametric Quantization for Variable Rate Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2405.14222" target="_blank" rel="noopener"
&gt;RAQ-VAE: Rate-Adaptive Vector-Quantized Variational Autoencoder&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;Arxiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="quantization"&gt;Quantization
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2025/papers/Relic_Bridging_the_Gap_between_Gaussian_Diffusion_Models_and_Universal_Quantization_CVPR_2025_paper.pdf" target="_blank" rel="noopener"
&gt;Bridging the Gap between Gaussian Diffusion Models and Universal Quantization for Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;CVPR 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/pdf?id=wqN6rWwYsr" target="_blank" rel="noopener"
&gt;Bridging the Gap between Diffusion Models and Universal Quantization for Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;NeurIPSW 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2411.16119v1" target="_blank" rel="noopener"
&gt;Learning Optimal Lattice Vector Quantizers for End-to-end Neural Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;amp;arnumber=10689618" target="_blank" rel="noopener"
&gt;Convolution Filter Compression via Sparse Linear Combinations of Quantized Basis&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TNNLS 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2409.09488v1" target="_blank" rel="noopener"
&gt;Lossy Image Compression with Stochastic Quantization&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.arxiv.org/pdf/2408.12691" target="_blank" rel="noopener"
&gt;Quantization-free Lossy Image Compression Using Integer Matrix Factorization&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2408.12150" target="_blank" rel="noopener"
&gt;DeepHQ: Learned Hierarchical Quantizer for Progressive Deep Image Coding&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10566339&amp;amp;casa_token=6MnXai1ergEAAAAA:98ttJhOF_UU12y_KPlwG0kWpI35xBScxcKz4gIbyAdOow-5pe4hasuqIPeC7nBrnavlgr7Y" target="_blank" rel="noopener"
&gt;A Quantization Loss Compensation Network for Remote Sensing Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;PCS 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2406.07548" target="_blank" rel="noopener"
&gt;Image and Video Tokenization with Binary Spherical Quantization&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10531761" target="_blank" rel="noopener"
&gt;NLIC: Non-uniform Quantization based Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TCSVT 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="entropy-model"&gt;Entropy Model
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/abs/2507.19125" target="_blank" rel="noopener"
&gt;Learned Image Compression with Hierarchical Progressive Context Modeling&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/forum?id=bsnRUkVn63" target="_blank" rel="noopener"
&gt;Test-time Adaptation for Image Compression with Distribution Regularization&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICLR 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2509.05169" target="_blank" rel="noopener"
&gt;Exploring Autoregressive Vision Foundation Models for Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/pdf?id=J28aP5HsRJ" target="_blank" rel="noopener"
&gt;Learned Image Compression Framework with Quad-Prior Entropy Model&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2509.18815" target="_blank" rel="noopener"
&gt;FlashGMM: Fast Gaussian Mixture Entropy Model for Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2411.19320v1" target="_blank" rel="noopener"
&gt;Generalized Gaussian Model for Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2501.12330v1" target="_blank" rel="noopener"
&gt;The Gap Between Principle and Practice of Lossy Image Coding&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2405.09152v5" target="_blank" rel="noopener"
&gt;Group Image Compression for Dual Use of Machine and Human Vision&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TCSVT 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2411.05832v1" target="_blank" rel="noopener"
&gt;Diversify, Contextualize, and Adapt: Efficient Entropy Modeling for Neural Image Codec&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2410.07669" target="_blank" rel="noopener"
&gt;Delta-ICM: Entropy Modeling with Delta Function for Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2410.04847" target="_blank" rel="noopener"
&gt;Causal Context Adjustment Loss for Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;NeurIPS 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/pdf?id=YTNN0mOPQN" target="_blank" rel="noopener"
&gt;Spatial-Temporal Context Model for Remote Sensing Imagery Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ACM MM 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2407.09983" target="_blank" rel="noopener"
&gt;WeConvene: Learned Image Compression with Wavelet-Domain Convolution and Entropy Model&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2407.11590" target="_blank" rel="noopener"
&gt;Rethinking Learned Image Compression: Context is All You Need&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2407.10632" target="_blank" rel="noopener"
&gt;Bidirectional Stereo Image Compression with Cross-Dimensional Entropy Model&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="achitecture"&gt;Achitecture
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.ecva.net/papers/eccv_2024/papers_ECCV/papers/06635.pdf" target="_blank" rel="noopener"
&gt;WeConvene: Learned Image Compression with Wavelet-Domain Convolution and Entropy Model&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ECCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.ecva.net/papers/eccv_2024/papers_ECCV/papers/06270.pdf" target="_blank" rel="noopener"
&gt;Region-Adaptive Transform with Segmentation Prior for Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ECCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.ecva.net/papers/eccv_2024/papers_ECCV/papers/03640.pdf" target="_blank" rel="noopener"
&gt;BaSIC: BayesNet Structure Learning for Computational Scalable Neural Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ECCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2509.10366" target="_blank" rel="noopener"
&gt;Efficient Learned Image Compression Through Knowledge Distillation&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://iccv.thecvf.com/virtual/2025/poster/2181" target="_blank" rel="noopener"
&gt;Cassic: Towards Content-Adaptive State-Space Models for Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2025/papers/Xu_PICD_Versatile_Perceptual_Image_Compression_with_Diffusion_Rendering_CVPR_2025_paper.pdf" target="_blank" rel="noopener"
&gt;PICD: Versatile Perceptual Image Compression with Diffusion Rendering&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;CVPR 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2025/papers/Zeng_MambaIC_State_Space_Models_for_High-Performance_Learned_Image_Compression_CVPR_2025_paper.pdf" target="_blank" rel="noopener"
&gt;MambaIC: State Space Models for High-Performance Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;CVPR 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/pdf?id=gIrVoQEDQv" target="_blank" rel="noopener"
&gt;Unraveling Neural Cellular Automata for Lightweight Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/pdf?id=Tv36j85SqR" target="_blank" rel="noopener"
&gt;Approaching Rate-Distortion Limits in Neural Compression with Lattice Transform Coding&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICLR 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2411.18494v1" target="_blank" rel="noopener"
&gt;Learning Optimal Linear Block Transform by Rate Distortion Minimization&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2412.15752v1" target="_blank" rel="noopener"
&gt;Sparse Point Clouds Assisted Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TCSVT 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2501.13751v1" target="_blank" rel="noopener"
&gt;On Disentangled Training for Nonlinear Transform in Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2410.12191" target="_blank" rel="noopener"
&gt;Test-time adaptation for image compression with distribution regularization&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2409.18730" target="_blank" rel="noopener"
&gt;Effectiveness of learning-based image codecs on fingerprint storage&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/abs/2410.02981" target="_blank" rel="noopener"
&gt;GABIC: Graph-Based Attention Block for Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2409.17134" target="_blank" rel="noopener"
&gt;Streaming Neural Images&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.arxiv.org/pdf/2408.03842" target="_blank" rel="noopener"
&gt;Bi-Level Spatial and Channel-aware Transformer for Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10743248&amp;amp;casa_token=uVcLjjVsiIAAAAAA:umWqK3-lWEAaYZLS6bGRwU83D_HltSVBFOPPF547AAOr-fKWKk4cWWscip13hDKI1ZYlPoc" target="_blank" rel="noopener"
&gt;Extreme Low Bitrate Image Compression System for Mobile Deployment&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;MMSP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2409.14090" target="_blank" rel="noopener"
&gt;Window-based Channel Attention for Wavelet-enhanced Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10647907&amp;amp;casa_token=_xL4m5ekrn0AAAAA:c7C1H9icT_KyIsjmgCz2uuikwvp8ukPivv5cDm_3V5nCspElz4BQXWWPxnrtmZmGv4pYddY" target="_blank" rel="noopener"
&gt;Feature Enhanced Learning Image Compression With Recurrent Criss-Cross Attention&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2408.17073" target="_blank" rel="noopener"
&gt;Approximately Invertible Neural Network for Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2408.14127" target="_blank" rel="noopener"
&gt;Rate-Distortion-Perception Controllable Joint Source-Channel Coding for High-Fidelity Generative Communications&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;Arxiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10648236/authors#authors" target="_blank" rel="noopener"
&gt;Structured Pruning and Quantization for Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10566341&amp;amp;casa_token=5IwoTIplk3sAAAAA:qmSZUREE9iZFM3FtnOzIscEwUAonnBfKeBw8tRob7l35ZWuRRaxxcKx68NXw8vRraaBVmrU" target="_blank" rel="noopener"
&gt;Practical Learned Image Compression with Online Encoder Optimization&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;PCS 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2406.10361" target="_blank" rel="noopener"
&gt;On Efficient Neural Network Architectures for Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2406.13709" target="_blank" rel="noopener"
&gt;A Study on the Effect of Color Spaces in Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10558571&amp;amp;casa_token=7OHwnFHkwDUAAAAA:fZ9rVL-B_QI8BT4AWEJkS8-M07rg9VWUxSY3Z1MBlWqoNQtpc4l9wDjz4uchHFS2SPZErEI" target="_blank" rel="noopener"
&gt;Learning-Based Conditional Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ISCAS 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10558635&amp;amp;casa_token=iR30sgfqXX0AAAAA:CygeYdTY8WGiAaUw68kNTiQAcmmiu1nSCbQ13daszhrMk4SO72ODDxLDgjAmHnlCXWRBwBs" target="_blank" rel="noopener"
&gt;Asymmetric Neural Image Compression with High-Preserving Information&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ISCAS 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10566428&amp;amp;casa_token=wYpGkb8wjkQAAAAA:xImfyLYnypOrxhvo6O4UHwHGsOVstRa_6jbBbmRMPdlJLMkBZsULXdcdHJ2wWnVIxkZkmsI" target="_blank" rel="noopener"
&gt;Wavelet-like Transform with Subbands Fusion in Decoupled Structure for Deep Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;PCS 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10559830&amp;amp;casa_token=FWJQglVJO3MAAAAA:BTaIvWu6YnP42QFsGfQak48wjhoAfmxhLVSZjJX-kgjRJ-2dH3y3tteKQn8h5-U-YCZP-IE" target="_blank" rel="noopener"
&gt;FDNet: Frequency Decomposition Network for Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TCSVT 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2407.09853" target="_blank" rel="noopener"
&gt;Image Compression for Machine and Human Vision with Spatial-Frequency Adaptation&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2407.11700" target="_blank" rel="noopener"
&gt;Rate-Distortion-Cognition Controllable Versatile Neural Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2405.15413" target="_blank" rel="noopener"
&gt;MambaVC: Learned Visual Compression with Selective State Spaces&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="screen-content-image"&gt;Screen Content Image
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.ijcai.org/proceedings/2024/0134.pdf" target="_blank" rel="noopener"
&gt;Efficient Screen Content Image Compression via Superpixel-based Content Aggregation and Dynamic Feature Fusion&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;IJCAI 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10577165&amp;amp;casa_token=ddUlyV468d4AAAAA:Ep5T9S4nD7zCZWS-ml46aRYuuKqAYMW518K3gLntWQ7GDCjuPpxRY5M7B7UtF42qZ_KiiuU&amp;amp;tag=1" target="_blank" rel="noopener"
&gt;DSCIC: Deep Screen Content Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TCSVT 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="hdr-image"&gt;HDR Image
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2407.13179v1" target="_blank" rel="noopener"
&gt;Breaking Boundaries: Unifying Imaging and Compression for HDR Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TIP 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2407.13179" target="_blank" rel="noopener"
&gt;Learned HDR Image Compression for Perceptually Optimal Storage and Display&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="image-coding-for-machine-vision"&gt;Image coding for machine vision
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.ecva.net/papers/eccv_2024/papers_ECCV/papers/06823.pdf" target="_blank" rel="noopener"
&gt;Image Compression for Machine and Human Vision With Spatial-Frequency Adaptation&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ECCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.ecva.net/papers/eccv_2024/papers_ECCV/papers/09009.pdf" target="_blank" rel="noopener"
&gt;A Unified Image Compression Method for Human Perception and Multiple Vision Tasks&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ECCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://dl.acm.org/doi/10.1145/3708347" target="_blank" rel="noopener"
&gt;Neural Image Compression with Regional Decoding&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ToMM 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2402.08862v1" target="_blank" rel="noopener"
&gt;Saliency Segmentation Oriented Deep Image Compression With Novel Bit Allocation&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="None" &gt;LL-ICM: Image Compression for Low-level Machine Vision via Large Vision-Language Model&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2310.09382v1" target="_blank" rel="noopener"
&gt;Task-Adapted Learnable Embedded Quantization for Scalable Human-Machine Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TCSVT 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2501.04329v1" target="_blank" rel="noopener"
&gt;An Efficient Adaptive Compression Method for Human Perception and Machine Vision Tasks&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2501.04579v1" target="_blank" rel="noopener"
&gt;Unified Coding for Both Human Perception and Generalized Machine Analytics with CLIP Supervision&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2409.19660v1" target="_blank" rel="noopener"
&gt;All-in-One Image Coding for Joint Human-Machine Vision with Multi-Path Aggregation&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2408.08575" target="_blank" rel="noopener"
&gt;Tell Codec What Worth Compressing: Semantically Disentangled Image Coding for Machine with LMMs&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="" &gt;Group Image Compression for Dual Use of Machine and Human Vision&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TCSVT 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2408.07028" target="_blank" rel="noopener"
&gt;Feature-Preserving Rate-Distortion Optimization in Image Coding for Machines&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;MMSP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10743309&amp;amp;casa_token=zKA0n7bsqFUAAAAA:HAwTji45HCcml__D27xCp29vhfB8Im2TXKbHm29ObXI80UW3kiaW4ckTorJJC7p1cZGUS5Y" target="_blank" rel="noopener"
&gt;Compression of Self-Supervised Representations for Machine Vision&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10647464&amp;amp;casa_token=____eHFo8BMAAAAA:U-jtu0xTn0RWA80FDfNvfith5yJz0sdvRTl5UhTQBhG_J874g9eNBXllfFgFRByMqDnY1zI&amp;amp;tag=1" target="_blank" rel="noopener"
&gt;Learned Image Compression for Both Humans and Machines via Dynamic Adaptation&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10648033?casa_token=H_-iMbpng6oAAAAA:zbDs9boDRETBQINfnLEbkz31FcWDyoORoBTCrmmlqXzN86tKR6sqdmXIAA-uHmVH1agtBxsCZw" target="_blank" rel="noopener"
&gt;Image Coding For Machine Via Analytics-Driven Appearance Redundancy Reduction&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10574324&amp;amp;casa_token=3hjufBt4DOEAAAAA:ZVH9S11WP5wB3eRmfHs02WCpHHe4_7cHo1SWnMNBuwaCoOJgkxOWk3UXhyUBlAVpCW4fgy4" target="_blank" rel="noopener"
&gt;Saliency Map-Guided End-to-End Image Coding for Machines&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;SPL 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10557851&amp;amp;casa_token=Fu-eEJDIq1gAAAAA:ap6uExZfQWevfhbLwgq3NoH-Q3SR4UBhsSFF7tnnAMTTsZjDPpUz73J0dSMhwR0B0iwQgH8" target="_blank" rel="noopener"
&gt;Redundancy Removal Module for Reducing the Bitrates of Image Coding for Machines&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ISCAS 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="medical-image"&gt;Medical Image
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2412.09231v1" target="_blank" rel="noopener"
&gt;Versatile Volumetric Medical Image Coding for Human-Machine Vision&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2405.16850" target="_blank" rel="noopener"
&gt;UniCompress: Enhancing Multi-Data Medical Image Compression with Knowledge Distillation&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;Arxiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="implicit-neural-representation"&gt;Implicit Neural Representation
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/forum?id=9u5hPIcr6j" target="_blank" rel="noopener"
&gt;LotteryCodec: Searching the Implicit Representation in a Random Network for Low-Complexity Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICML 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2509.18748" target="_blank" rel="noopener"
&gt;HyperCool: Reducing Encoding Cost in Overfitted Codecs with Hypernetworks&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;Arxiv 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10647328" target="_blank" rel="noopener"
&gt;Redefining Visual Quality: The Impact of Loss Functions on INR-Based Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10647328?casa_token=4zOGbEd8ye4AAAAA:HK-ntiQYpO25P-fk_Dob31eeKFZOJ4CFqwOTT5ZaivzBkAUTfcXvoLWxHeaPhoH6K2_BtZHF-A" target="_blank" rel="noopener"
&gt;Implicit Neural Image Field for Biological Microscopy Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="panoramicstereo-image"&gt;Panoramic/stereo Image
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="eccv2024.ecva.net//virtual/2024/poster/1797" &gt;Bidirectional Stereo Image Compression with Cross-Dimensional Entropy Model&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ECCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10721338/authors#authors" target="_blank" rel="noopener"
&gt;Learning Content-Weighted Pseudocylindrical Representation for 360° Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="benchmark--dataset--survey"&gt;Benchmark &amp;amp; Dataset &amp;amp; Survey
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/document/10807668/" target="_blank" rel="noopener"
&gt;JPEG AI: The First International Standard for Image Coding Based on an End-to-End Learning-Based Approach&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;IEEE MultiMedia 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://dl.acm.org/doi/pdf/10.1145/3664647.3685519" target="_blank" rel="noopener"
&gt;OpenDIC: An Open-Source Library and Performance Evaluation for Deep-learning-based Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ACMMM 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="others"&gt;Others
&lt;/h3&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style="text-align: left"&gt;Title&lt;/th&gt;
&lt;th style="text-align: center"&gt;Pub. &amp;amp; Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2507.17221" target="_blank" rel="noopener"
&gt;Dataset Distillation as Data Compression: A Rate-Utility Perspective&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2025/papers/Zhang_Balanced_Rate-Distortion_Optimization_in_Learned_Image_Compression_CVPR_2025_paper.pdf" target="_blank" rel="noopener"
&gt;Balanced Rate-Distortion Optimization in Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;CVPR 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/forum?id=olzs3zVsE7" target="_blank" rel="noopener"
&gt;Privacy-Shielded Image Compression: Defending Against Exploitation from Vision-Language Pretrained Models&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICML 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/pdf?id=ialr09SfeJ" target="_blank" rel="noopener"
&gt;Synonymous Variational Inference for Perceptual Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICML 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ojs.aaai.org/index.php/AAAI/article/view/33111/35266" target="_blank" rel="noopener"
&gt;CAMSIC: Content-aware Masked Image Modeling Transformer for Stereo Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;AAAI 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2412.01646v1" target="_blank" rel="noopener"
&gt;Robust and Transferable Backdoor Attacks Against Deep Image Compression With Selective Frequency Prior&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2411.06810v1" target="_blank" rel="noopener"
&gt;JPEG AI Image Compression Visual Artifacts: Detection Methods and Dataset&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2411.16727v2" target="_blank" rel="noopener"
&gt;An Information-Theoretic Regularizer for Lossy Neural Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICCV 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2411.10650v1" target="_blank" rel="noopener"
&gt;Deep Learning-Based Image Compression for Wireless Communications: Impacts on Reliability, Throughput, and Latency&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://ieeexplore.ieee.org/document/10814661/" target="_blank" rel="noopener"
&gt;HNR-ISC: Hybrid Neural Representation for Image Set Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;TMM 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="http://arxiv.org/abs/2412.03261v1" target="_blank" rel="noopener"
&gt;Is JPEG AI going to change image forensics?&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://www.nowpublishers.com/article/OpenAccessDownload/SIP-20240025" target="_blank" rel="noopener"
&gt;2D Gaussian Splatting for Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ATSIP 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2410.20145" target="_blank" rel="noopener"
&gt;Cross-Platform Neural Video Coding: A Case Study&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;arXiv 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://openreview.net/pdf?id=zIrvyQdIG4" target="_blank" rel="noopener"
&gt;Gone With the Bits: Benchmarking Bias in Facial Phenotype Degradation Under Low-Rate Neural Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;ICMLW 2024&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style="text-align: left"&gt;&lt;a class="link" href="https://arxiv.org/pdf/2409.11111" target="_blank" rel="noopener"
&gt;Few-Shot Domain Adaptation for Learned Image Compression&lt;/a&gt;&lt;/td&gt;
&lt;td style="text-align: center"&gt;AAAI 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h2 id="2024"&gt;✔2024
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;(SPL 2024) &lt;strong&gt;OMR-NET: A Two-Stage Octave Multi-Scale Residual Network for Screen Content Image Compression&lt;/strong&gt; Jiang S, Ren T, Fu C, et al. &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10552293&amp;amp;casa_token=HZozj0vMXvkAAAAA:_7rf8zPrb-WjgI1-i9BoraOqIEMGQdTWcvj2NUfc-3GEtogq1VavMVzi2kKx8yF3hrNoAX6lfg&amp;amp;tag=1" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TPAMI 2024) &lt;strong&gt;I2C: Invertible Continuous Codec for High-Fidelity Variable-Rate Image Compression&lt;/strong&gt; Cai, Shilv and Chen, Liqun and Zhang, Zhijun and Zhao, Xiangyun and Zhou, Jiahuan and Peng, Yuxin and Yan, Luxin and Zhong, Sheng and Zou, Xu &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10411123" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICASSP 2024) &lt;strong&gt;Leveraging Redundancy in Feature for Efficient Learned Image Compression&lt;/strong&gt; Qin, Peng and Bao, Youneng and Meng, Fanyang and Tan, Wen and Li, Chao and Wang, Genhong and Liang, Yongsheng &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10447424" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICASSP 2024) &lt;strong&gt;Rate-Quality Based Rate Control Model for Neural Video Compression&lt;/strong&gt; Liao, Shuhong and Jia, Chuanmin and Fan, Hongfei and Yan, Jingwen and Ma, Siwei &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;amp;arnumber=10447777" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICASSP 2024) &lt;strong&gt;Enhancing Adversarial Training with Prior Knowledge Distillation for Robust Image Compression&lt;/strong&gt; Cao, Zhi and Bao, Youneng and Meng, Fanyang and Li, Chao and Tan, Wen and Wang, Genhong and Liang, Yongsheng &lt;a class="link" href="https://arxiv.org/pdf/2403.06700v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(AAAI 2024) &lt;strong&gt;Make Lossy Compression Meaningful for Low-Light Images&lt;/strong&gt; Cai, Shilv and Chen, Liqun and Zhong, Sheng and Yan, Luxin and Zhou, Jiahuan and Zou, Xu &lt;a class="link" href="https://ojs.aaai.org/index.php/AAAI/article/download/28664/29289" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(AAAI 2024) &lt;strong&gt;End-to-End RGB-D Image Compression via Exploiting Channel-Modality Redundancy&lt;/strong&gt; Zheng, Huiming and Gao, Wei &lt;a class="link" href="https://ojs.aaai.org/index.php/AAAI/article/download/28588/29143" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2024) &lt;strong&gt;Towards Backward-Compatible Continual Learning of Image Compression&lt;/strong&gt; Duan, Zhihao and Lu, Ming and Yang, Justin and He, Jiangpeng and Ma, Zhan and Zhu, Fengqing &lt;a class="link" href="https://arxiv.org/pdf/2402.18862v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(NeurIPS 2024) &lt;strong&gt;Compression with Bayesian Implicit Neural Representations&lt;/strong&gt; Guo, Zongyu and Flamich, Gergely and He, Jiajun and Chen, Zhibo and Hernández-Lobato, José Miguel &lt;a class="link" href="https://arxiv.org/pdf/2305.19185.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TIP 2024) &lt;strong&gt;Bilateral Context Modeling for Residual Coding in Lossless 3D Medical Image Compression&lt;/strong&gt; Liu, Xiangrui and Wang, Meng and Wang, Shiqi and Kwong, Sam &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;amp;arnumber=10478821" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TMM 2024) &lt;strong&gt;Neural Network Coding of Difference Updates for Efficient Distributed Learning Communication&lt;/strong&gt; Sheng, Xihua and Li, Li and Liu, Dong and Li, Houqiang &lt;a class="link" href="https://arxiv.org/pdf/2401.15864.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2024) &lt;strong&gt;FICNet: An End-to-End Network for Free-view Image Coding&lt;/strong&gt; Yang, Chunhui and Yang, Jiayu and Zhai, Yongqi and Wang, Ronggang &lt;a class="link" href="https://ieeexplore.ieee.org/document/10504389" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2024) &lt;strong&gt;GroupedMixer: An Entropy Model with Group-wise Token-Mixers for Learned Image Compression&lt;/strong&gt; Li, Daxin and Bai, Yuanchao and Wang, Kai and Jiang, Junjun and Liu, Xianming and Gao, Wen &lt;a class="link" href="https://arxiv.org/pdf/2405.01170" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2024) &lt;strong&gt;Multirate Progressive Entropy Model for Learned Image Compression&lt;/strong&gt; Li, Chao and Yin, Shanzhi and Jia, Chuanmin and Meng, Fanyang and Tian, Yonghong and Liang, Yongsheng &lt;a class="link" href="https://ieeexplore.ieee.org/document/10471618" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2024) &lt;strong&gt;EUICN: An Efficient Underwater Image Compression Network&lt;/strong&gt; Li, Mengyao and Shen, Liquan and Hua, Xia and Tian, Zhaoyi &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;amp;arnumber=10445326" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2024) &lt;strong&gt;Rate-Distortion Optimized Cross Modal Compression with Multiple Domains&lt;/strong&gt; Gao, Junlong and Jia, Chuanmin and Huang, Zhimeng and Wang, Shanshe and Ma, Siwei and Gao, Wen &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10430161" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ToMM 2024) &lt;strong&gt;Perceptual Quality-Oriented Rate Allocation via Distillation from End-to-End Image Compression&lt;/strong&gt; Yang, Runyu and Liu, Dong and Ma, Siwei and Wu, Feng and Gao, Wen &lt;a class="link" href="https://dl.acm.org/doi/pdf/10.1145/3650034" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TGRS 2024) &lt;strong&gt;Remote Sensing Image Compression Based on High-Frequency and Low-Frequency Components&lt;/strong&gt; Xiang, Shao and Liang, Qiaokang &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10379598&amp;amp;casa_token=8o7Rvla9bkIAAAAA:BdM70h2rnznpm8AjLpmF2OaaY4LOyj96msdVfnJyaYeQ-EVVWgoAz8YSFYoxbq2tG6L95AQr" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(WACV 2024) &lt;strong&gt;Neural Image Compression Using Masked Sparse Visual Representation&lt;/strong&gt; Jiang, Wei and Wang, Wei and Chen, Yue &lt;a class="link" href="https://arxiv.org/pdf/2309.11661.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(PCS 2024) &lt;strong&gt;CoCliCo: Extremely low bitrate image compression based on CLIP semantic and tiny color map&lt;/strong&gt; Bachard, Tom and Bordin, Tom and Maugey, Thomas &lt;a class="link" href="https://inria.hal.science/hal-04478601/file/PCS_2024-2-1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(IEVC 2024) &lt;strong&gt;The Effect of Edge Information in Stable Diffusion Applied to Image Coding&lt;/strong&gt; Watanabe, Hiroshi and Chujoh, Takeshi and Fan, Zheming and Jin, Luoxu and Yasugi, Yukinobu and Ikai, Tomohiro and Hayami, Taiga and Hong, Sujun &lt;a class="link" href="https://www.ams.giti.waseda.ac.jp/data/pdf-files/2024IEVC_LBP-15.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(SPL 2024) &lt;strong&gt;Enhancing High-Resolution Image Compression Through Local-Global Joint Attention Mechanism&lt;/strong&gt; Jiang, Zeyu and Liu, Xiaohong and Li, Aini and Wang, Guangyu &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10487886" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(SPL 2024) &lt;strong&gt;Learning-Based Image Compression With Parameter-Adaptive Rate-Constrained Loss&lt;/strong&gt; Guerin, Nilson D and da Silva, Renam Castro and Macchiavello, Bruno &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10487041?casa_token=knUB41_TmBsAAAAA:a-OvI58YlhHCqICs5ondcAnowi-IGX2nx0TgWqjjp_VfILwGajk6aEbDfqpUAqvF6--XxzsqGQ" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICASSP 2024) &lt;strong&gt;Fine color guidance in diffusion models and its application to image compression at extremely low bitrates&lt;/strong&gt; Bordin, Tom and Maugey, Thomas &lt;a class="link" href="https://ieeexplore.ieee.org/document/10445837" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;On the Adversarial Robustness of Learning-based Image Compression Against Rate-Distortion Attacks&lt;/strong&gt; Wu, Chenhao and Wu, Qingbo and Wei, Haoran and Chen, Shuai and Wang, Lei and Ngan, King Ngi and Meng, Fanman and Li, Hongliang &lt;a class="link" href="https://arxiv.org/pdf/2405.07717" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Scalable Image Coding for Humans and Machines Using Feature Fusion Network&lt;/strong&gt; Li, Junhui and Hou, Xingsong &lt;a class="link" href="https://arxiv.org/pdf/2405.09152" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Towards Task-Compatible Compressible Representations&lt;/strong&gt; de Andrade, Anderson and Bajić, Ivan &lt;a class="link" href="https://arxiv.org/pdf/2405.10244" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Enhancing Perception Quality in Remote Sensing Image Compression via Invertible Neural Network&lt;/strong&gt; Li, Junhui and Hou, Xingsong &lt;a class="link" href="https://arxiv.org/pdf/2405.10518" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;NLIC: Non-uniform Quantization based Learned Image Compression&lt;/strong&gt; Ge, Ziqing and Ma, Siwei and Gao, Wen and Pan, Jingshan and Jia, Chuanmin &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10531761" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Domain Adaptation for Learned Image Compression with Supervised Adapters&lt;/strong&gt; Presta, Alberto and Spadaro, Gabriele and Tartaglione, Enzo and Fiandrotti, Attilio and Grangetto, Marco &lt;a class="link" href="https://arxiv.org/pdf/2404.15591" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;2D Gaussian Splatting for Image Compression&lt;/strong&gt; Pingping Zhang, Xiangrui Liu, Meng Wang, Shiqi Wang, Sam Kwong &lt;a class="link" href="https://github.com/ppingzhang/2DGS_ImageCompression/blob/main/2DGS_APSIPA.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Towards Extreme Image Compression with Latent Feature Guidance and Diffusion Prior&lt;/strong&gt; Li, Zhiyuan and Zhou, Yanhui and Wei, Hao and Ge, Chenyang and Jiang, Jingwen &lt;a class="link" href="https://arxiv.org/pdf/2404.18820" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;S2LIC: Learned Image Compression with the SwinV2 Block, Adaptive Channel-wise and Global-inter Attention Context&lt;/strong&gt; Wang, Yongqiang and Liang, Feng and Liang, Jie and Fu, Haisheng &lt;a class="link" href="https://arxiv.org/pdf/2403.14471.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Lossy Image Compression with Foundation Diffusion Models&lt;/strong&gt; Relic, Lucas and Azevedo, Roberto and Gross, Markus and Schroers, Christopher &lt;a class="link" href="https://arxiv.org/pdf/2404.08580.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Correcting Diffusion-Based Perceptual Image Compression with Privileged End-to-End Decoder&lt;/strong&gt; Ma, Yiyang and Yang, Wenhan and Liu, Jiaying &lt;a class="link" href="https://arxiv.org/html/2404.04916v1" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Fine color guidance in diffusion models and its application to image compression at extremely low bitrates&lt;/strong&gt; Bordin, Tom and Maugey, Thomas &lt;a class="link" href="https://arxiv.org/pdf/2404.06865.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Human-Machine Collaborative Image Compression Method Based on Implicit Neural Representations&lt;/strong&gt; Li, Huanyang and Zhang, Xinfeng &lt;a class="link" href="https://arxiv.org/pdf/2112.04267.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Generative Refinement for Low Bitrate Image Coding Using Vector Quantized Residual&lt;/strong&gt; Kong, Yuzhuo and Lu, Ming and Ma, Zhan &lt;a class="link" href="https://ieeexplore.ieee.org/document/10493033?denied=" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Image and Video Compression using Generative Sparse Representation with Fidelity Controls&lt;/strong&gt; Jiang, Wei and Wang, Wei &lt;a class="link" href="https://arxiv.org/pdf/2404.06076.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Content-aware Masked Image Modeling Transformer for Stereo Image Compression&lt;/strong&gt; Zhang, Xinjie and Gao, Shenyuan and Liu, Zhening and Ge, Xingtong and He, Dailan and Xu, Tongda and Wang, Yan and Zhang, Jun &lt;a class="link" href="https://arxiv.org/pdf/2403.08505v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Super-High-Fidelity Image Compression via Hierarchical-ROI and Adaptive Quantization&lt;/strong&gt; Luo, Jixiang and Wang, Yan and Qin, Hongwei &lt;a class="link" href="https://arxiv.org/pdf/2403.13030.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Theoretical Bound-Guided Hierarchical VAE for Neural Image Codecs&lt;/strong&gt; Zhang, Yichi and Duan, Zhihao and Huang, Yuning and Zhu, Fengqing &lt;a class="link" href="https://arxiv.org/pdf/2403.18535v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Unifying Generation and Compression: Ultra-low bitrate Image Coding Via Multi-stage Transformer&lt;/strong&gt; Xue, Naifu and Mao, Qi and Wang, Zijian and Zhang, Yuan and Ma, Siwei &lt;a class="link" href="https://arxiv.org/pdf/2403.03736.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Enhancing the Rate-Distortion-Perception Flexibility of Learned Image Codecs with Conditional Diffusion Decoders&lt;/strong&gt; Mari, Daniele and Milani, Simone &lt;a class="link" href="https://arxiv.org/pdf/2403.02887v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Channel-wise Feature Decorrelation for Enhanced Learned Image Compression&lt;/strong&gt; Pakdaman, Farhad and Gabbouj, Moncef &lt;a class="link" href="https://arxiv.org/pdf/2403.10936.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Overfitted image coding at reduced complexity&lt;/strong&gt; Blard, Théophile and Ladune, Théo and Philippe, Pierrick and Clare, Gordon and Jiang, Xiaoran and Déforges, Olivier &lt;a class="link" href="https://arxiv.org/pdf/2403.11651v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Neural Image Compression with Text-guided Encoding for both Pixel-level and Perceptual Fidelity&lt;/strong&gt; Lee, Hagyeong and Kim, Minkyu and Kim, Jun-Hyuk and Kim, Seungeon and Oh, Dokwan and Lee, Jaeho &lt;a class="link" href="https://arxiv.org/pdf/2403.02944.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Transformer-based Learned Image Compression for Joint Decoding and Denoising&lt;/strong&gt; Chen, Yi-Hsin and Ho, Kuan-Wei and Tsai, Shiau-Rung and Lin, Guan-Hsun and Gnutti, Alessandro and Peng, Wen-Hsiao and Leonardi, Riccardo &lt;a class="link" href="https://arxiv.org/pdf/2402.12888v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Probing Image Compression For Class-Incremental Learning&lt;/strong&gt; Yang, Justin and Duan, Zhihao and Peng, Andrew and Huang, Yuning and He, Jiangpeng and Zhu, Fengqing &lt;a class="link" href="https://arxiv.org/pdf/2403.06288.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Variable-Rate Learned Image Compression with Multi-Objective Optimization and Quantization-Reconstruction Offsets&lt;/strong&gt; Kamisli, Fatih and Racape, Fabien and Choi, Hyomin &lt;a class="link" href="https://arxiv.org/pdf/2402.18930v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Exploration of Learned Lifting-Based Transform Structures for Fully Scalable and Accessible Wavelet-Like Image Compression&lt;/strong&gt; Li, Xinyue and Naman, Aous and Taubman, David &lt;a class="link" href="https://arxiv.org/pdf/2402.18761v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Powerful Lossy Compression for Noisy Images&lt;/strong&gt; Cai, Shilv and Liang, Xiaoguo and Cao, Shuning and Yan, Luxin and Zhong, Sheng and Chen, Liqun and Zou, Xu &lt;a class="link" href="https://arxiv.org/pdf/2403.14135v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Enhancing Adversarial Training with Prior Knowledge Distillation for Robust Image Compression&lt;/strong&gt; Zhi, Cao and Youneng, Bao and Fanyang, Meng and Chao, Li and Wen, Tan and Genhong, Wang and Yongsheng, Liang &lt;a class="link" href="https://arxiv.org/pdf/2403.06700v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Image Coding for Machines with Edge Information Learning Using Segment Anything&lt;/strong&gt; Shindo, Takahiro and Yamada, Kein and Watanabe, Taiju and Watanabe, Hiroshi &lt;a class="link" href="https://arxiv.org/pdf/2403.04173v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Resilience of Entropy Model in Distributed Neural Networks&lt;/strong&gt; Zhang, Milin and Abdi, Mohammad and Rifat, Shahriar and Restuccia, Francesco &lt;a class="link" href="https://arxiv.org/pdf/2403.00942v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;GaussianImage: 1000 FPS Image Representation and Compression by 2D Gaussian Splatting&lt;/strong&gt; Zhang, Xinjie and Ge, Xingtong and Xu, Tongda and He, Dailan and Wang, Yan and Qin, Hongwei and Lu, Guo and Geng, Jing and Zhang, Jun &lt;a class="link" href="https://arxiv.org/pdf/2403.08551v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Wavelet-Like Transform-Based Technology in Response to the Call for Proposals on Neural Network-Based Image Coding&lt;/strong&gt; Dong, Cunhui and Ma, Haichuan and Zhang, Haotian and Gao, Changsheng and Li, Li and Liu, Dong &lt;a class="link" href="https://arxiv.org/pdf/2403.05937v1.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Region-Adaptive Transform with Segmentation Prior for Image Compression&lt;/strong&gt; Liu, Yuxi and Yang, Wenhan and Bai, Huihui and Wei, Yunchao and Zhao, Yao &lt;a class="link" href="https://arxiv.org/pdf/2403.00628.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Learned Image Compression with Text Quality Enhancement&lt;/strong&gt; Lai, Chih-Yu and Tran, Dung and Koishida, Kazuhito &lt;a class="link" href="https://arxiv.org/pdf/2402.08643.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;End-to-End Optimized Image Compression with the Frequency-Oriented Transform&lt;/strong&gt; Zhang, Yuefeng and Lin, Kai &lt;a class="link" href="https://arxiv.org/pdf/2401.08194.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Learned Image Compression with ROI-Weighted Distortion and Bit Allocation&lt;/strong&gt; Jiang, Wei and Zhai, Yongqi and Li, Hangyu and Wang, Ronggang &lt;a class="link" href="https://arxiv.org/pdf/2401.08154.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Semantic Ensemble Loss and Latent Refinement for High-Fidelity Neural Image Compression&lt;/strong&gt; Li, Daxin and Bai, Yuanchao and Wang, Kai and Jiang, Junjun and Liu, Xianming &lt;a class="link" href="https://arxiv.org/pdf/2401.14007.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;FLLIC: Functionally Lossless Image Compression&lt;/strong&gt; Zhang, Xi and Wu, Xiaolin &lt;a class="link" href="https://arxiv.org/pdf/2401.13616.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Fast Implicit Neural Representation Image Codec in Resource-limited Devices&lt;/strong&gt; Liu, Xiang and Chen, Jiahong and Chen, Bin and Liu, Zimo and An, Baoyi and Xia, Shu-Tao &lt;a class="link" href="https://arxiv.org/pdf/2401.12587.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(NeurIPS 2024) &lt;strong&gt;Robustly overfitting latents for flexible neural image compression&lt;/strong&gt; Perugachi-Diaz, Yura and Gansekoele, Arwin and Bhulai, Sandjai &lt;a class="link" href="https://arxiv.org/pdf/2401.17789.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Saliency-aware End-to-end Learned Variable-Bitrate 360-degree Image Compression&lt;/strong&gt; Gungordu, Oguzhan and Tekalp, A Murat &lt;a class="link" href="https://arxiv.org/pdf/2402.08862.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Joint End-to-End Image Compression and Denoising: Leveraging Contrastive Learning and Multi-Scale Self-ONNs&lt;/strong&gt; Xie, Yuxin and Yu, Li and Pakdaman, Farhad and Gabbouj, Moncef &lt;a class="link" href="https://arxiv.org/pdf/2402.05582.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2024) &lt;strong&gt;Learned Compression of Encoding Distributions&lt;/strong&gt; Ulhaq, Mateen and Bajic, Ivan V &lt;a class="link" href="https://www.sfu.ca/~mulhaq/assets/pdf/2024-icip-learned-compression-of-encoding-distributions.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(VCIP 2024) &lt;strong&gt;Flexible Coding Order for Learned Image Compression&lt;/strong&gt; Li, Yuqi and Zhang, Haotian and Liu, Dong &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10402631" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(VCIP 2024) &lt;strong&gt;Variable-rate Learned Image Compression with Adaptive Quantization Step Size&lt;/strong&gt; Mei, Feihong and Li, Li and Liu, Dong &lt;a class="link" href="https://ieeexplore.ieee.org/stampPDF/getPDF.jsp?tp=&amp;amp;arnumber=10402767&amp;amp;ref=" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(VCIP 2024) &lt;strong&gt;Learned Progressive Image Compression With Spatial Autoregression&lt;/strong&gt; Li, Hangyu and Jiang, Wei and Li, Litian and Zhai, Yongqi and Wang, Ronggang &lt;a class="link" href="https://ieeexplore.ieee.org/stampPDF/getPDF.jsp?tp=&amp;amp;arnumber=10402651&amp;amp;ref=" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(VCIP 2024) &lt;strong&gt;Hybrid Implicit Neural Image Compression with Subpixel Context Model and Iterative Pruner&lt;/strong&gt; Tian, Wenxin and Li, Shaohui and Dai, Wenrui and Lu, Cewu and Hu, Weisheng and Zhang, Lin and Du, Junfeng and Xiong, Hongkai &lt;a class="link" href="https://ieeexplore.ieee.org/stampPDF/getPDF.jsp?tp=&amp;amp;arnumber=10402791&amp;amp;ref=" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="2023"&gt;✔2023
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;(NeurIPS 2023) &lt;strong&gt;Towards efficient image compression without autoregressive models&lt;/strong&gt; Ali, Muhammad Salman and Kim, Yeongwoong and Qamar, Maryam and Lim, Sung-Chang and Kim, Donghyun and Zhang, Chaoning and Bae, Sung-Ho and Kim, Hui Yong &lt;a class="link" href="https://openreview.net/pdf?id=1ihGy9vAIg" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICONIP 2023) &lt;strong&gt;LUT-LIC: Look-Up Table-Assisted Learned Image Compression&lt;/strong&gt; Yu, SeungEun and Lee, Jong-Seok &lt;a class="link" href="https://link.springer.com/chapter/10.1007/978-981-99-8148-9_34" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ACMMM 2023) &lt;strong&gt;Toward Scalable Image Feature Compression: A Content-Adaptive and Diffusion-Based Approach&lt;/strong&gt; Guo, Sha and Chen, Zhuo and Zhao, Yang and Zhang, Ning and Li, Xiaotong and Duan, Lingyu &lt;a class="link" href="https://dl.acm.org/doi/pdf/10.1145/3581783.3611851?casa_token=mNmCMwSt2NcAAAAA:pYJtS3-8nkQdv-d0hp5N3OptJqtnjFcfBNOohVR0SqCbdP9mF4tFuAZEN5_WiTkVaxttfYUdfyqJHw" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ACMMM 2023) &lt;strong&gt;Nif: A fast implicit image compression with bottleneck layers and modulated sinusoidal activations&lt;/strong&gt; Catania, Lorenzo and Allegra, Dario &lt;a class="link" href="https://dl.acm.org/doi/pdf/10.1145/3581783.3613834" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ACMMM 2023) &lt;strong&gt;Lambda-Domain Rate Control for Neural Image Compression&lt;/strong&gt; Xue, Naifu and Zhang, Yuan &lt;a class="link" href="https://dl.acm.org/doi/pdf/10.1145/3595916.3626372?casa_token=ZQoUWGi2J6UAAAAA:3NWoCPBC-hhmWmMgcu3uPf_UFg0eSN3fLoeBi_8S0GKRJaW78mnXjkxBesKBwfe30nzHI0PEXGfAVQ" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ACMMM 2023) &lt;strong&gt;MLIC: Multi-Reference Entropy Model for Learned Image Compression&lt;/strong&gt; Jiang, Wei and Yang, Jiayu and Zhai, Yongqi and Ning, Peirong and Gao, Feng and Wang, Ronggang &lt;a class="link" href="https://dl.acm.org/doi/pdf/10.1145/3581783.3611694" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ACMMM 2023) &lt;strong&gt;ELFIC: A Learning-based Flexible Image Codec with Rate-Distortion-Complexity Optimization&lt;/strong&gt; Zhang, Zhichen and Chen, Bolin and Lin, Hongbin and Lin, Jielian and Wang, Xu and Zhao, Tiesong &lt;a class="link" href="https://dl.acm.org/doi/pdf/10.1145/3581783.3612540" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ACMMM 2023) &lt;strong&gt;ICMH-Net: Neural Image Compression Towards both Machine Vision and Human Vision&lt;/strong&gt; Liu, Lei and Hu, Zhihao and Chen, Zhenghao and Xu, Dong &lt;a class="link" href="https://dl.acm.org/doi/pdf/10.1145/3581783.3612041?casa_token=S1tEOBghRlUAAAAA:3QJByYZssGAMLB6Yloy9eCwEEkI7RrZQ_kuaJfIjBCaWH45RJomJC4uQN1StEi_UplaboXcyaEASvA" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TIP 2023) &lt;strong&gt;Learned Image Compression Using Cross-Component Attention Mechanism&lt;/strong&gt; Duan, Wenhong and Chang, Zheng and Jia, Chuanmin and Wang, Shanshe and Ma, Siwei and Song, Li and Gao, Wen &lt;a class="link" href="https://ieeexplore.ieee.org/document/10268865/" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TIP 2023) &lt;strong&gt;Scalable Face Image Coding via StyleGAN Prior: Towards Compression for Human-Machine Collaborative Vision&lt;/strong&gt; Mao, Qi and Wang, Chongyu and Wang, Meng and Wang, Shiqi and Chen, Ruijie and Jin, Libiao and Ma, Siwei &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10372532&amp;amp;casa_token=tefNsn9cqyIAAAAA:iNI1vVcH9m8rW3GLAj-yB_6FC_eiNBGUUiIzVaAlYC7JHRxGElmSd1MdVYHKD0P-9FtPMq5aEw" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICCV 2023) &lt;strong&gt;Dec-Adapter: Exploring Efficient Decoder-Side Adapter for Bridging Screen Content and Natural Image Compression&lt;/strong&gt; Shen, Sheng and Yue, Huanjing and Yang, Jingyu &lt;a class="link" href="https://openaccess.thecvf.com/content/ICCV2023/papers/Shen_Dec-Adapter_Exploring_Efficient_Decoder-Side_Adapter_for_Bridging_Screen_Content_and_ICCV_2023_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2023) &lt;strong&gt;Context-Based Trit-Plane Coding for Progressive Image Compression&lt;/strong&gt; Jeon, Seungmin and Choi, Kwang Pyo and Park, Youngo and Kim, Chang-Su &lt;a class="link" href="https://arxiv.org/pdf/2303.05715.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICCV 2023) &lt;strong&gt;TransTIC: Transferring Transformer-based Image Compression from Human Perception to Machine Perception&lt;/strong&gt; Chen, Yi-Hsin and Weng, Ying-Chieh and Kao, Chia-Hao and Chien, Cheng and Chiu, Wei-Chen and Peng, Wen-Hsiao &lt;a class="link" href="https://openaccess.thecvf.com/content/ICCV2023/papers/Chen_TransTIC_Transferring_Transformer-based_Image_Compression_from_Human_Perception_to_Machine_ICCV_2023_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TAI 2023) &lt;strong&gt;Manipulation Attacks on Learned Image Compression&lt;/strong&gt; Liu, Kang and Wu, Di and Wu, Yangyu and Wang, Yiru and Feng, Dan and Tan, Benjamin and Garg, Siddharth &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10352982&amp;amp;casa_token=7J9wZTEfvZUAAAAA:A4rT0GYrKkWQ8h1hhnQxyazt_2kunYTDE1vn73nQD5RDms-6eoJ_ZUppgHNr3WTBk143oCWW6Q" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;A Decoupled Spatial-Channel Inverted Bottleneck For Image Compression&lt;/strong&gt; Hu, Yuting and Tan, Wen and Meng, Fanyang and Liang, Yongsheng &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10222366" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;NUCQ: Non-Uniform Conditional Quantization for Learned Image Compression&lt;/strong&gt; Ge, Ziqing and Jia, Chuanmin and Ma, Siwei and Gao, Wen &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10222198" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;End-to-End Learning-based Image Compression with A Decoupled Framework&lt;/strong&gt; Zhang, Zhaobin and Esenlik, Semih and Wu, Yaojun and Wang, Meng and Zhang, Kai and Zhang, Li &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10247017/metrics#metrics" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;Advancing the Rate-Distortion-Computation Frontier for Neural Image Compression&lt;/strong&gt; Minnen, David and Johnston, Nick &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10222381" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;Efficient Pruning Method for Learned Lossy Image Compression Models Based on Side Information&lt;/strong&gt; Chen, Weixuan and Yang, Qianqian &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10222822" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;Content-Adaptive Parallel Entropy Coding for End-to-End Image Compression&lt;/strong&gt; Li, Shujia and Wang, Dezhao and Fan, Zejia and Liu, Jiaying &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10222067" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;Edge-Guided Remote-Sensing Image Compression&lt;/strong&gt; Han, Pengfei and Zhao, Bin and Li, Xuelong &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10247080" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;Learned Image Compression Guided Adaptive Quantization for Perceptual Quality&lt;/strong&gt; Chen, Cheng and Geng, Ruiqi and Li, Bohan and Ustarroz-Calonge, Maryla and Galligan, Frank and Han, Jingning and Xu, Yaowu &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10222637" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;Unified Learning-Based Lossy and Lossless Jpeg Recompression&lt;/strong&gt; J. Zhang et al. &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10222354" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;ULcompress: A Unified Low Bit-rate Image Compression Framework via Invertible Image Representation&lt;/strong&gt; F. Gao, X. Deng, C. Gao and M. Xu &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10222242" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;Learned Image Compression with Multi-Scan Based Channel Fusion&lt;/strong&gt; Y. Li, W. Zhou, P. Lu and S. -i. Kamata, &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10222127" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;Integer Quantized Learned Image Compression&lt;/strong&gt; G. -W. Jeon, S. Yu and J. -S. Lee &lt;a class="link" href="https://ieeexplore.ieee.org/document/10222336" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;A Decoupled Spatial-Channel Inverted Bottleneck For Image Compression&lt;/strong&gt; Hu, Yuting and Tan, Wen and Meng, Fanyang and Liang, Yongsheng &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10222381" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;Learned Image Compression with Large Capacity and Low Redundancy of Latent Representation&lt;/strong&gt; Meng, Xiandong and Zhu, Shuyuan and Ma, Siwei and Zeng, Bing &lt;a class="link" href="https://ieeexplore.ieee.org/document/10222366" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;An Improved Upper Bound on the Rate-Distortion Function of Images&lt;/strong&gt; Duan, Zhihao and Ma, Jack and He, Jiangpeng and Zhu, Fengqing&lt;a class="link" href="https://arxiv.org/pdf/2309.02574.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2023) &lt;strong&gt;AICT: An Adaptive Image Compression Transformer&lt;/strong&gt; Ghorbel, Ahmed and Hamidouche, Wassim and Morin, Luce&lt;a class="link" href="https://arxiv.org/pdf/2307.06091.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(WACV 2023) &lt;strong&gt;Neural Distributed Image Compression with Cross-Attention Feature Alignment&lt;/strong&gt; Mital, Nitish and Özyilkan, Ezgi and Garjani, Ali and Gündüz, Deniz &lt;a class="link" href="https://openaccess.thecvf.com/content/WACV2023/papers/Mital_Neural_Distributed_Image_Compression_With_Cross-Attention_Feature_Alignment_WACV_2023_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(VCIP 2023) &lt;strong&gt;Image Data Hiding in Neural Compressed Latent Representations&lt;/strong&gt; Huang, Chen-Hsiu and Wu, Ja-Ling&lt;a class="link" href="https://ieeexplore.ieee.org/document/10402627" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICLR 2023) &lt;strong&gt;EVC: Towards Real-Time Neural Image Compression with Mask Decay&lt;/strong&gt; Wang, Guo-Hua and Li, Jiahao and Li, Bin and Lu, Yan &lt;a class="link" href="https://arxiv.org/pdf/2302.05071.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(VCIP 2023) &lt;strong&gt;A Near Lossless Learned Image Coding Network Quantization Approach for Cross-Platform Inference&lt;/strong&gt; Hang, Xinyu and Jia, Chuanmin and Ma, Siwei and Gao, Wen &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10402704&amp;amp;casa_token=SpFz9g7TeT8AAAAA:GNVUj1Qv03LvWGp3bF9iyCSr_-ZLx6-HNZM4vxYXFqs_yTFitBKet3htVPIc1LR4uKboCvnL" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICASSP 2023) &lt;strong&gt;A Novel Cross-Component Context Model for End-to-End Wavelet Image Coding&lt;/strong&gt; Meyer, Anna and Kaup, André &lt;a class="link" href="https://arxiv.org/pdf/2303.05121.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2024) &lt;strong&gt;Lightweight Context Model Equipped aiWave in Response to the AVS Call for Evidence on Volumetric Medical Image Coding&lt;/strong&gt; Xue, Dongmei and Li, Li and Liu, Dong and Li, Houqiang &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10453226" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2023) &lt;strong&gt;MASIC: Deep Mask Stereo Image Compression&lt;/strong&gt; Deng, Xin and Deng, Yufan and Yang, Ren and Yang, Wenzhe and Timofte, Radu and Xu, Mai &lt;a class="link" href="https://scholar.google.com/scholar_url?url=https://ieeexplore.ieee.org/iel7/76/4358651/10061473.pdf%3Fcasa_token%3DyxaR8FAUmccAAAAA:NZVDcw8yyjkyl1jR53FSSfUBKSAUxSgFwjNl6n3E3gjtklYQ7e6KLBD0sY9rtdPDj3cMxRyjb3w&amp;amp;hl=zh-CN&amp;amp;sa=T&amp;amp;oi=ucasa&amp;amp;ct=ucasa&amp;amp;ei=C7HBZbBn6NLL1g_hqbWgCw&amp;amp;scisig=AFWwaeauyyBtBEhlO7xzS3SgL_l_" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2023) &lt;strong&gt;Extremely Low Bit-rate Image Compression via Invertible Image Generation&lt;/strong&gt; Gao, Fangyuan and Deng, Xin and Jing, Junpeng and Zou, Xin and Xu, Mai &lt;a class="link" href="https://ieeexplore.ieee.org/document/10256132" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2023) &lt;strong&gt;Task-Switchable Pre-Processor for Image Compression for Multiple Machine Vision Tasks&lt;/strong&gt; Yang, Mingyi and Yang, Fei and Murn, Luka and Blanch, Marc Gorriz and Sock, Juil and Wan, Shuai and Yang, Fuzheng and Herranz, Luis &lt;a class="link" href="https://ieeexplore.ieee.org/document/10256132" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2023) &lt;strong&gt;Rethinking semantic image compression: Scalable representation with cross-modality transfer&lt;/strong&gt; Zhang, Pingping and Wang, Shiqi and Wang, Meng and Li, Jiguo and Wang, Xu and Kwong, Sam &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10032603&amp;amp;casa_token=jUWiQNkyzn4AAAAA:sB3n5iqEj4xbTgiOrrXxsI5lbXizq0V9wxvkaZ71ik2nPah0yHZ8WzHwbkrp-URvTMuHukK3&amp;amp;tag=1" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2023) &lt;strong&gt;Facial Image Compression via Neural Image Manifold Compression&lt;/strong&gt; Yang, Wenhan and Huang, Haofeng and Liu, Jiaying and Kot, Alex C. &lt;a class="link" href="https://ieeexplore.ieee.org/abstract/document/10122667" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2023) &lt;strong&gt;Sketch Assisted Face Image Coding for Human and Machine Vision: a Joint Training Approach&lt;/strong&gt; Fang, Xin and Duan, Yiping and Du, Qiyuan and Tao, Xiaoming and Li, Fan &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10082973&amp;amp;casa_token=bXnEBK4JjLcAAAAA:JO0euK8CEhYZUGE70J9G-3WUZVOVeh5DkXdHQRnWQCSrgg4ybixUxy1J0tFCcYyZWvvggncp" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICCV 2023) &lt;strong&gt;COMPASS: High-Efficiency Deep Image Compression with Arbitrary-scale Spatial Scalability&lt;/strong&gt; Park, Jongmin and Lee, Jooyoung and Kim, Munchurl &lt;a class="link" href="https://openaccess.thecvf.com/content/ICCV2023/papers/Park_COMPASS_High-Efficiency_Deep_Image_Compression_with_Arbitrary-scale_Spatial_Scalability_ICCV_2023_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICCV 2023) &lt;strong&gt;AdaNIC: Towards Practical Neural Image Compression via Dynamic Transform Routing&lt;/strong&gt; Tao, Lvfang and Gao, Wei and Li, Ge and Zhang, Chenhao &lt;a class="link" href="https://openaccess.thecvf.com/content/ICCV2023/papers/Tao_AdaNIC_Towards_Practical_Neural_Image_Compression_via_Dynamic_Transform_Routing_ICCV_2023_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(WACV 2024) &lt;strong&gt;Controlling Rate, Distortion, and Realism: Towards a Single Comprehensive Neural Image Compression Model&lt;/strong&gt; Iwai, Shoma and Miyazaki, Tomo and Omachi, Shinichiro &lt;a class="link" href="https://openaccess.thecvf.com/content/WACV2024/papers/Iwai_Controlling_Rate_Distortion_and_Realism_Towards_a_Single_Comprehensive_Neural_WACV_2024_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2023) &lt;strong&gt;EGIC: Enhanced Low-Bit-Rate Generative Image Compression Guided by Semantic Segmentation&lt;/strong&gt; Körber, Nikolai and Kromer, Eduard and Siebert, Andreas and Hauke, Sascha and Mueller-Gritschneder, Daniel &lt;a class="link" href="https://arxiv.org/pdf/2309.03244.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2023) &lt;strong&gt;A Training-Free Defense Framework for Robust Learned Image Compression&lt;/strong&gt; Song, Myungseo and Choi, Jinyoung and Han, Bohyung &lt;a class="link" href="https://arxiv.org/pdf/2401.11902.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2023) &lt;strong&gt;FFCA-Net: Stereo Image Compression via Fast Cascade Alignment of Side Information&lt;/strong&gt; Xia, Yichong and Huang, Yujun and Chen, Bin and Wang, Haoqian and Wang, Yaowei &lt;a class="link" href="https://arxiv.org/pdf/2312.16963.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2023) &lt;strong&gt;Another Way to the Top: Exploit Contextual Clustering in Learned Image Coding&lt;/strong&gt; Zhang, Yichi and Duan, Zhihao and Lu, Ming and Ding, Dandan and Zhu, Fengqing and Ma, Zhan &lt;a class="link" href="https://arxiv.org/pdf/2401.11615.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2023) &lt;strong&gt;Attack and Defense Analysis of Learned Image Compression&lt;/strong&gt; Zhu, Tianyu and Sun, Heming and Xiong, Xiankui and Zhu, Xuanpeng and Gong, Yong and Fan, Yibo and others &lt;a class="link" href="https://arxiv.org/pdf/2401.10345.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2023) &lt;strong&gt;Fast and High-Performance Learned Image Compression With Improved Checkerboard Context Model, Deformable Residual Module, and Knowledge Distillation&lt;/strong&gt; Fu, Haisheng and Liang, Feng and Liang, Jie and Wang, Yongqiang and Zhang, Guohe and Han, Jingning &lt;a class="link" href="https://arxiv.org/pdf/2309.02529.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2023) &lt;strong&gt;Multi-Context Dual Hyper-Prior Neural Image Compression&lt;/strong&gt; Khoshkhahtinat, Atefeh and Zafari, Ali and Mehta, Piyush M and Akyash, Mohammad and Kashiani, Hossein and Nasrabadi, Nasser M &lt;a class="link" href="https://arxiv.org/pdf/2309.10799.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2023) &lt;strong&gt;On Uniform Scalar Quantization for Learned Image Compression&lt;/strong&gt; Zhang, Haotian and Li, Li and Liu, Dong&lt;a class="link" href="https://arxiv.org/pdf/2309.17051.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2023) &lt;strong&gt;Frequency-Aware Transformer for Learned Image Compression&lt;/strong&gt; Li, Han and Li, Shaohui and Dai, Wenrui and Li, Chenglin and Zou, Junni and Xiong, Hongkai&lt;a class="link" href="https://arxiv.org/pdf/2310.16387.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2023) &lt;strong&gt;Perceptual Image Compression with Cooperative Cross-Modal Side Information&lt;/strong&gt; Qin, Shiyu and Chen, Bin and Huang, Yujun and An, Baoyi and Dai, Tao and Xia, Shu-Tao &lt;a class="link" href="https://arxiv.org/pdf/2311.13847.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2023) &lt;strong&gt;Progressive Learning with Visual Prompt Tuning for Variable-Rate Image Compression&lt;/strong&gt; Qin, Shiyu and Zhou, Yimin and Wang, Jinpeng and Chen, Bin and An, Baoyi and Dai, Tao and Xia, Shu-Tao&lt;a class="link" href="https://arxiv.org/pdf/2311.17350.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2023) &lt;strong&gt;Exploring the Rate-Distortion-Complexity Optimization in Neural Image Compression&lt;/strong&gt; Gao, Yixin and Feng, Runsen and Guo, Zongyu and Chen, Zhibo&lt;a class="link" href="https://arxiv.org/pdf/2305.07678.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(JVCIR 2023) &lt;strong&gt;Corner-to-Center long-range context model for efficient learned image compression&lt;/strong&gt; Sui, Yang and Ding, Ding and Pan, Xiang and Xu, Xiaozhong and Liu, Shan and Yuan, Bo and Chen, Zhenzhong &lt;a class="link" href="https://arxiv.org/pdf/2311.18103.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="2022"&gt;✔2022
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;(PCS 2022) &lt;strong&gt;Reducing The Amortization Gap of Entropy Bottleneck In End-to-End Image Compression&lt;/strong&gt; Balcilar, Muhammet and Damodaran, Bharath and Hellier, Pierre &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10018064&amp;amp;casa_token=T3OEyA4gC_UAAAAA:hV74ZEkQEKKE940LsRyDFRFIhIQcATSnQKZsc8mTr2UTT6jLIMAyBijHG1pTfFJG-8VxRRn7XuA" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR workshop 2022) &lt;strong&gt;Self-Supervised Variable Rate Image Compression using Visual Attention&lt;/strong&gt; Sinha, Abhishek Kumar and Moorthi, S Manthira and Dhar, Debajyoti&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022W/CLIC/papers/Sinha_Self-Supervised_Variable_Rate_Image_Compression_Using_Visual_Attention_CVPRW_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR workshop 2022) &lt;strong&gt;RDONet: Rate-Distortion Optimized Learned Image Compression with Variable Depth&lt;/strong&gt; Brand, Fabian and Fischer, Kristian and Kopte, Alexander and Windsheimer, Marc and Kaup, André &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022W/CLIC/papers/Brand_RDONet_Rate-Distortion_Optimized_Learned_Image_Compression_With_Variable_Depth_CVPRW_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2022) &lt;strong&gt;Transformations in Learned Image Compression from Modulation Perspective&lt;/strong&gt; Bao, Youneng and Meng, Fangyang and Tan, Wen and Li, Chao and Tian, Yonghong and Liang, Yongsheng &lt;a class="link" href="https://arxiv.org/pdf/2203.02158.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2022) &lt;strong&gt;Flexible Neural Image Compression via Code Editing&lt;/strong&gt; Gao, Chenjian and Xu, Tongda and He, Dailan and Qin, Hongwei and Wang, Yan &lt;a class="link" href="https://arxiv.org/pdf/2209.09244.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2022) &lt;strong&gt;Attention-Based Generative Neural Image Compression on Solar Dynamics Observatory&lt;/strong&gt; Zafari, Ali and Khoshkhahtinat, Atefeh and Mehta, Piyush M and Nasrabadi, Nasser M and Thompson, Barbara J and da Silva, Daniel and Kirk, Michael SF&lt;a class="link" href="https://arxiv.org/pdf/2210.06478.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2022) &lt;strong&gt;Progressive Deep Image Compression for Hybrid Contexts of Image Classification and Reconstruction&lt;/strong&gt; Lei, Zhongyue and Duan, Peng and Hong, Xuemin and Mota, João FC and Shi, Jianghong and Wang, Cheng-Xiang &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9970515&amp;amp;casa_token=wr2tdLJpoSQAAAAA:yxNRSlqMzqo0libGY0kbkrP79VRTccC5BmKEzCC5ziY9shpizVudordovWx5BOFOgQSHC7dxrZs" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(WACV 2023) &lt;strong&gt;Universal Deep Image Compression via Content-Adaptive Optimization with Adapters&lt;/strong&gt; Tsubota, Koki and Akutsu, Hiroaki and Aizawa, Kiyoharu &lt;a class="link" href="https://openaccess.thecvf.com/content/WACV2023/papers/Tsubota_Universal_Deep_Image_Compression_via_Content-Adaptive_Optimization_With_Adapters_WACV_2023_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2022) &lt;strong&gt;User-Guided Variable Rate Learned Image Compression&lt;/strong&gt; Gupta, Rushil and BV, Suryateja and Kapoor, Nikhil and Jaiswal, Rajat and Nangi, Sharmila Reddy and Kulkarni, Kuldeep&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022W/CLIC/papers/Gupta_User-Guided_Variable_Rate_Learned_Image_Compression_CVPRW_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2022) &lt;strong&gt;Adaptive Bitrate Quantization Scheme Without Codebook for Learned Image Compression&lt;/strong&gt; Löhdefink, Jonas and Sitzmann, Jonas and Bär, Andreas and Fingscheidt, Tim &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022W/CLIC/papers/Lohdefink_Adaptive_Bitrate_Quantization_Scheme_Without_Codebook_for_Learned_Image_Compression_CVPRW_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TIP 2022) &lt;strong&gt;OSLO: On-the-Sphere Learning for Omnidirectional images and its application to 360-degree image compression&lt;/strong&gt; Bidgoli, Navid Mahmoudian and Azevedo, Roberto G. de A. and Maugey, Thomas and Roumy, Aline and Frossard, Pascal &lt;a class="link" href="https://arxiv.org/pdf/2107.09179.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(AAAI 2022) &lt;strong&gt;Two-Stage Octave Residual Network for End-to-End Image Compression&lt;/strong&gt; Chen, Fangdong and Xu, Yumeng and Wang, Li &lt;a class="link" href="https://scholar.google.com/scholar?hl=zh-CN&amp;amp;as_sdt=0%2C5&amp;amp;q=Two-Stage&amp;#43;Octave&amp;#43;Residual&amp;#43;Network&amp;#43;for&amp;#43;End-to-End&amp;#43;Image&amp;#43;Compression&amp;amp;btnG=#:~:text=%E5%B9%B4%E4%BB%BD-,%5BPDF%5D%20aaai.org,-Two%2DStage%20Octave" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2022) &lt;strong&gt;Preprocessing Enhanced Image Compression for Machine Vision&lt;/strong&gt; Lu, Guo and Ge, Xingtong and Zhong, Tianxiong and Geng, Jing and Hu, Qiang &lt;a class="link" href="https://arxiv.org/pdf/2206.05650.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2022) &lt;strong&gt;Learning-Driven Lossy Image Compression: A Comprehensive Survey&lt;/strong&gt; Jamil, Sonain and Piran, Md and others &lt;a class="link" href="https://arxiv.org/pdf/2201.09240.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2022) &lt;strong&gt;Estimating the Resize Parameter in End-to-end Learned Image Compression&lt;/strong&gt; Chen, Li-Heng and Bampis, Christos G and Li, Zhi and Krasula, Lukáš and Bovik, Alan C &lt;a class="link" href="https://arxiv.org/pdf/2204.12022.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2022) &lt;strong&gt;Image Compression with Product Quantized Masked Image Modeling&lt;/strong&gt; El-Nouby, Alaaeldin and Muckley, Matthew J and Ullrich, Karen and Laptev, Ivan and Verbeek, Jakob and Jégou, Hervé &lt;a class="link" href="https://arxiv.org/pdf/2212.07372.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ITJ 2022) &lt;strong&gt;Human–Machine Interaction-Oriented Image Coding for Resource-Constrained Visual Monitoring in IoT&lt;/strong&gt;
Wang, Zixi and Li, Fan and Xu, Jing and Cosman, Pamela C &lt;a class="link" href="" &gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TGRS 2022) &lt;strong&gt;Towards simultaneous image compression and indexing for scalable content-based retrieval in remote sensing&lt;/strong&gt; Sumbul, Gencer and Xiang, Jun and Demir, Begüm &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9878355" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(SPI 2022) &lt;strong&gt;Rate-constrained learning-based image compression&lt;/strong&gt; &lt;a class="link" href="" &gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TIP 2022) &lt;strong&gt;Exploiting Intra-Slice and Inter-Slice Redundancy for Learning-Based Lossless Volumetric Image Compression&lt;/strong&gt; Chen, Zhenghao and Gu, Shuhang and Lu, Guo and Xu, Dong &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9694511&amp;amp;casa_token=_INFRj8nkRkAAAAA:_4VWc5Q56n7hHUi5xnIS3Yyno0YRwyVWQdEnU2XqmAV6Sv_XnG7SgBnO0DfYUnoLuNP-3iKOivk" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt; lossless&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2022) &lt;strong&gt;Entroformer: A transformer-based entropy model for learned image compression&lt;/strong&gt; Qian, Yichen and Lin, Ming and Sun, Xiuyu and Tan, Zhiyu and Jin, Rong &lt;a class="link" href="https://arxiv.org/pdf/2202.05492.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(Arxiv 2022) &lt;strong&gt;Multi-Sample Training for Neural Image Compression&lt;/strong&gt; Xu, Tongda and Wang, Yan and He, Dailan and Gao, Chenjian and Gao, Han and Liu, Kunzan and Qin, Hongwei &lt;a class="link" href="https://arxiv.org/pdf/2209.13834.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2022) &lt;strong&gt;ELIC: Efficient Learned Image Compression with Unevenly Grouped Space-Channel Contextual Adaptive Coding&lt;/strong&gt; He, Dailan and Yang, Ziming and Peng, Weikun and Ma, Rui and Qin, Hongwei and Wang, Yan &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022/papers/He_ELIC_Efficient_Learned_Image_Compression_With_Unevenly_Grouped_Space-Channel_Contextual_CVPR_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ECCV 2022) &lt;strong&gt;Contextformer: A Transformer with Spatio-Channel Attention for Context Modeling in Learned Image Compression&lt;/strong&gt; Koyuncu, A Burakhan and Gao, Han and Boev, Atanas and Gaikov, Georgii and Alshina, Elena and Steinbach, Eckehard &lt;a class="link" href="https://link.springer.com/content/pdf/10.1007/978-3-031-19800-7_26.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ECCV 2022) &lt;strong&gt;Content-Oriented Learned Image Compression&lt;/strong&gt; Li, Meng and Gao, Shangyin and Feng, Yihui and Shi, Yibo and Wang, Jing &lt;a class="link" href="https://link.springer.com/content/pdf/10.1007/978-3-031-19800-7_37.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ECCV 2022) &lt;strong&gt;Implicit Neural Representations for Image Compression&lt;/strong&gt; Strümpler, Yannick and Postels, Janis and Yang, Ren and Gool, Luc Van and Tombari, Federico &lt;a class="link" href="https://arxiv.org/pdf/2112.04267.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ECCV 2022) &lt;strong&gt;Content Adaptive Latents and Decoder for Neural Image Compression&lt;/strong&gt; Pan, Guanbo and Lu, Guo and Hu, Zhihao and Xu, Dong &lt;a class="link" href="https://arxiv.org/pdf/2212.10132.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ECCV 2022) &lt;strong&gt;Optimizing Image Compression via Joint Learning with Denoising&lt;/strong&gt; Cheng, Ka Leong and Xie, Yueqi and Chen, Qifeng &lt;a class="link" href="https://link.springer.com/content/pdf/10.1007/978-3-031-19800-7_4.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt; denoising&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(2022) &lt;strong&gt;2C-Net: Integrate Image Compression and Classification via Deep Neural Network&lt;/strong&gt; Liu, Linfeng and Chen, Tong and Liu, Haojie and Pu, Shiliang and Wang, Li and Shen, Qiu &lt;a class="link" href="https://assets.researchsquare.com/files/rs-2049607/v1_covered.pdf?c=1663278884" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ACMMM 2022) &lt;strong&gt;High-Fidelity Variable-Rate Image Compression via Invertible Activation Transformation&lt;/strong&gt; Cai, Shilv and Zhang, Zhijun and Chen, Liqun and Yan, Luxin and Zhong, Sheng and Zou, Xu [&lt;a class="link" href="https://arxiv.org/pdf/2209.05054.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arxiv 2022) &lt;strong&gt;Deep Lossy Plus Residual Coding for Lossless and Near-lossless Image Compression&lt;/strong&gt; Bai, Yuanchao and Liu, Xianming and Wang, Kai and Ji, Xiangyang and Wu, Xiaolin and Gao, Wen [&lt;a class="link" href="https://arxiv.org/pdf/2209.04847.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2022) &lt;strong&gt;End-to-End Optimized Image Compression With Deep Gaussian Process Regression&lt;/strong&gt; Cao, Maida and Dai, Wenrui and Li, Shaohui and Li, Chenglin and Zou, Junni and Chen, Ying and Xiong, Hongkai [&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;amp;arnumber=9903432" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TIP 2022) &lt;strong&gt;End-to-end optimized 360° image compression&lt;/strong&gt; Li, Mu and Li, Jinxing and Gu, Shuhang and Wu, Feng and Zhang, David [&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;amp;arnumber=9904466" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arxiv 2022) &lt;strong&gt;Lossy Compression with Gaussian Diffusion&lt;/strong&gt; Theis, Lucas and Salimans, Tim and Hoffman, Matthew D and Mentzer, Fabian [&lt;a class="link" href="https://arxiv.org/pdf/2206.08889.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arxiv 2022) &lt;strong&gt;Joint Image Compression and Denoising via Latent-Space Scalability&lt;/strong&gt; Alvar, Saeed Ranjbar and Ulhaq, Mateen and Choi, Hyomin and Bajić, Ivan V [&lt;a class="link" href="https://arxiv.org/pdf/2205.01874.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arxiv 2022) &lt;strong&gt;Post-Training Quantization for Cross-Platform Learned Image Compression&lt;/strong&gt; He, Dailan and Yang, Ziming and Chen, Yuan and Zhang, Qi and Qin, Hongwei and Wang, Yan [&lt;a class="link" href="https://arxiv.org/pdf/2202.07513.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICASSP 2022) &lt;strong&gt;Satellite Image Compression and Denoising With Neural Networks&lt;/strong&gt; Yin, Shanzhi and Li, Chao and Bao, Youneng and Liang, Yongsheng and Meng, Fanyang and Liu, Wei [&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;amp;arnumber=9747854" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICASSP 2022) &lt;strong&gt;AdderIC: Towards Low Computation Cost Image Compression&lt;/strong&gt; Li, Bowen and Xin, Yao and Li, Chao and Bao, Youneng and Meng, Fanyang and Liang, Yongsheng [&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;amp;arnumber=9747652" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(IEEE Geoscience and Remote Sensing Letters 2022) &lt;strong&gt;Universal Efficient Variable-Rate Neural Image Compression&lt;/strong&gt; de Oliveira, Vinicius Alves and Chabert, Marie and Oberlin, Thomas and Poulliat, Charly and Bruno, Mickael and Latry, Christophe and Carlavan, Mikael and Henrot, Simon and Falzon, Frederic and Camarero, Roberto [&lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;amp;arnumber=9690871" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2022) &lt;strong&gt;The Devil Is in the Details: Window-Based Attention for Image Compression&lt;/strong&gt; Zou, Renjie and Song, Chunfeng and Zhang, Zhaoxiang &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022/papers/Zou_The_Devil_Is_in_the_Details_Window-Based_Attention_for_Image_CVPR_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2022) &lt;strong&gt;Joint Global and Local Hierarchical Priors for Learned Image Compression&lt;/strong&gt; Kim, Jun-Hyuk and Heo, Byeongho and Lee, Jong-Seok &lt;a class="link" href="https://arxiv.org/pdf/2112.04487.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2022) &lt;strong&gt;RIDDLE: Lidar Data Compression with Range Image Deep Delta Encoding&lt;/strong&gt; Zhou, Xuanyu and Qi, Charles R and Zhou, Yin and Anguelov, Dragomir [&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022/papers/Zhou_RIDDLE_Lidar_Data_Compression_With_Range_Image_Deep_Delta_Encoding_CVPR_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2022) &lt;strong&gt;Neural Data-Dependent Transform for Learned Image Compression&lt;/strong&gt; Wang, Dezhao and Yang, Wenhan and Hu, Yueyu and Liu, Jiaying [&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022/papers/Wang_Neural_Data-Dependent_Transform_for_Learned_Image_Compression_CVPR_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2022) &lt;strong&gt;Self-Supervised Variable Rate Image Compression using Visual Attention&lt;/strong&gt; Sinha, Abhishek Kumar and Moorthi, S Manthira and Dhar, Debajyoti [&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022W/CLIC/papers/Sinha_Self-Supervised_Variable_Rate_Image_Compression_Using_Visual_Attention_CVPRW_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2022) &lt;strong&gt;User-Guided Variable Rate Learned Image Compression&lt;/strong&gt; Gupta, Rushil and BV, Suryateja and Kapoor, Nikhil and Jaiswal, Rajat and Nangi, Sharmila Reddy and Kulkarni, Kuldeep [&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022W/CLIC/papers/Gupta_User-Guided_Variable_Rate_Learned_Image_Compression_CVPRW_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2022) &lt;strong&gt;RDONet: Rate-Distortion Optimized Learned Image Compression With Variable Depth&lt;/strong&gt; Brand, Fabian and Fischer, Kristian and Kopte, Alexander and Windsheimer, Marc and Kaup, André. [&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022W/CLIC/papers/Brand_RDONet_Rate-Distortion_Optimized_Learned_Image_Compression_With_Variable_Depth_CVPRW_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2022) &lt;strong&gt;LC-FDNet: Learned Lossless Image Compression with Frequency Decomposition Network&lt;/strong&gt; Rhee, Hochang and Jang, Yeong Il and Kim, Seyun and Cho, Nam Ik. [&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022/papers/Rhee_LC-FDNet_Learned_Lossless_Image_Compression_With_Frequency_Decomposition_Network_CVPR_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2022) &lt;strong&gt;PO-ELIC: Perception-Oriented Efficient Learned Image Coding&lt;/strong&gt; He, Dailan and Yang, Ziming and Yu, Hongjiu and Xu, Tongda and Luo, Jixiang and Chen, Yuan and Gao, Chenjian and Shi, Xinjie and Qin, Hongwei and Wang, Yan. [&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022W/CLIC/papers/He_PO-ELIC_Perception-Oriented_Efficient_Learned_Image_Coding_CVPRW_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2022) &lt;strong&gt;Online Meta Adaptation for Variable-Rate Learned Image Compression&lt;/strong&gt; Jiang, Wei and Wang, Wei and Li, Songnan and Liu, Shan. [&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022W/NTIRE/papers/Jiang_Online_Meta_Adaptation_for_Variable-Rate_Learned_Image_Compression_CVPRW_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2022) &lt;strong&gt;Unified Multivariate Gaussian Mixture for Efficient Neural Image Compression&lt;/strong&gt; Zhu, Xiaosu and Song, Jingkuan and Gao, Lianli and Zheng, Feng and Shen, Heng Tao. [&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022/papers/Zhu_Unified_Multivariate_Gaussian_Mixture_for_Efficient_Neural_Image_Compression_CVPR_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2022) &lt;strong&gt;Split Hierarchical Variational Compression&lt;/strong&gt; Ryder, Tom and Zhang, Chen and Kang, Ning and Zhang, Shifeng. [&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022/papers/Ryder_Split_Hierarchical_Variational_Compression_CVPR_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2022) &lt;strong&gt;SASIC: Stereo Image Compression With Latent Shifts and Stereo Attention&lt;/strong&gt; Wödlinger, Matthias and Kotera, Jan and Xu, Jan and Sablatnig, Robert. [&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022/papers/Wodlinger_SASIC_Stereo_Image_Compression_With_Latent_Shifts_and_Stereo_Attention_CVPR_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2022) &lt;strong&gt;Deep Stereo Image Compression via Bi-directional Coding&lt;/strong&gt;, Lei, Jianjun and Liu, Xiangrui and Peng, Bo and Jin, Dengchao and Li, Wanqing and Gu, Jingxiao [&lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2022/papers/Lei_Deep_Stereo_Image_Compression_via_Bi-Directional_Coding_CVPR_2022_paper.pdf" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(AAAI 2022) &lt;strong&gt;OoDHDR-Codec: Out-of-Distribution Generalization for HDR Image Compression&lt;/strong&gt;, Cao, Linfeng and Jiang, Aofan and Li, Wei and Wu, Huaying and Ye, Nanyang &lt;a class="link" href="https://www.aaai.org/AAAI22Papers/AAAI-8610.CaoL.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (HDR)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2022) &lt;strong&gt;Unified Multivariate Gaussian Mixture for Efficient Neural Image Compression&lt;/strong&gt;, Zhu, Xiaosu and Song, Jingkuan and Gao, Lianli and Zheng, Feng and Shen, Heng Tao &lt;a class="link" href="https://arxiv.org/pdf/2203.10897.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;a class="link" href="https://github.com/xiaosu-zhu/McQuic" target="_blank" rel="noopener"
&gt;[code]&lt;/a&gt; (E)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2022) &lt;strong&gt;Estimating the Resize Parameter in End-to-end Learned Image Compression&lt;/strong&gt;, Chen, Li-Heng and Bampis, Christos G and Li, Zhi and Krasula, Lukáš and Bovik, Alan C &lt;a class="link" href="https://arxiv.org/pdf/2204.12022.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (Sa)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2022) &lt;strong&gt;DeepFGS: Fine-Grained Scalable Coding for Learned Image Compression&lt;/strong&gt;, Ma, Yi and Zhai, Yongqi and Wang, Ronggang &lt;a class="link" href="https://arxiv.org/pdf/2201.01173.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;(Sa)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2022) &lt;strong&gt;End-to-End Learned Block-Based Image Compression with Block-Level Masked Convolutions and Asymptotic Closed Loop Training&lt;/strong&gt;, Kamisli, Fatih &lt;a class="link" href="https://arxiv.org/pdf/2203.11686.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (T+E)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2022) &lt;strong&gt;Transformations in Learned Image Compression from Modulation Perspective&lt;/strong&gt;, Bao, Youneng and Meng, Fangyang and Tan, Wen and Li, Chao and Tian, Yonghong and Liang, Yongsheng &lt;a class="link" href="https://arxiv.org/pdf/2203.02158.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2022) &lt;strong&gt;Identity Preserving Loss for Learned Image Compression&lt;/strong&gt;, Xiao, Jiuhong and Aggarwal, Lavisha and Banerjee, Prithviraj and Aggarwal, Manoj and Medioni, Gerard &lt;a class="link" href="https://arxiv.org/pdf/2204.10869.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2022) &lt;strong&gt;High-Efficiency Lossy Image Coding Through Adaptive Neighborhood Information Aggregation&lt;/strong&gt;, Lu, Ming and Ma, Zhan &lt;a class="link" href="https://arxiv.org/pdf/2204.11448.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (E)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2022) &lt;strong&gt;Learning Weighting Map for Bit-Depth Expansion within a Rational Range&lt;/strong&gt;, Liu, Yuqing and Jia, Qi and Zhang, Jian and Fan, Xin and Wang, Shanshe and Ma, Siwei and Gao, Wen &lt;a class="link" href="https://arxiv.org/pdf/2204.12039.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; &lt;a class="link" href="https://github.com/yuqing-liu-dut/bit-depth-expansion" target="_blank" rel="noopener"
&gt;[code]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2022) &lt;strong&gt;Joint Image Compression and Denoising via Latent-Space Scalability&lt;/strong&gt;, Ranjbar Alvar, Saeed and Ulhaq, Mateen and Choi, Hyomin and Bajić, Ivan V &lt;a class="link" href="https://arxiv.org/pdf/2205.01874.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="2021"&gt;✔2021
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;(TPAMI 2021) &lt;strong&gt;Learning end-to-end lossy image compression: A benchmark&lt;/strong&gt;, Hu, Yueyu and Yang, Wenhan and Ma, Zhan and Liu, Jiaying &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9376651" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; &lt;a class="link" href="https://github.com/huzi96/Coarse2Fine-PyTorch" target="_blank" rel="noopener"
&gt;[code]&lt;/a&gt;(Benchmark)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(IJCV 2021) &lt;strong&gt;Semantics-to-signal scalable image compression with learned revertible representations&lt;/strong&gt;, Liu, Kang and Liu, Dong and Li, Li and Yan, Ning and Li, Houqiang &lt;a class="link" href="https://link.springer.com/content/pdf/10.1007/s11263-021-01491-7.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (Scalable)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TIP 2021) &lt;strong&gt;Semantic Perceptual Image Compression With a Laplacian Pyramid of Convolutional Networks&lt;/strong&gt;, Wang, Juan and Duan, Yiping and Tao, Xiaoming and Xu, Mai and Lu, Jianhua &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;amp;arnumber=9381614" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICLR 2021) &lt;strong&gt;Hierarchical Image Compression Framework&lt;/strong&gt;, Ge, Yunying and Wang, Jing and Shi, Yibo and Gao, Shangyin &lt;a class="link" href="https://openreview.net/pdf?id=8rPXT-SVgjh" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICCV 2021) &lt;strong&gt;Variable-Rate Deep Image Compression through Spatially-Adaptive Feature Transform&lt;/strong&gt;, Song, Myungseo and Choi, Jinyoung and Han, Bohyung &lt;a class="link" href="https://openaccess.thecvf.com/content/ICCV2021/papers/Song_Variable-Rate_Deep_Image_Compression_Through_Spatially-Adaptive_Feature_Transform_ICCV_2021_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (E)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2021) &lt;strong&gt;Asymmetric Gained Deep Image Compression With Continuous Rate Adaptation&lt;/strong&gt;, Cui, Ze and Wang, Jing and Gao, Shangyin and Guo, Tiansheng and Feng, Yihui and Bai, Bo &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2021/papers/Cui_Asymmetric_Gained_Deep_Image_Compression_With_Continuous_Rate_Adaptation_CVPR_2021_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; &lt;a class="link" href="https://github.com/mmSir/GainedVAE" target="_blank" rel="noopener"
&gt;[code]&lt;/a&gt;(VR)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2021) &lt;strong&gt;Checkerboard context model for efficient learned image compression&lt;/strong&gt;, He, Dailan and Zheng, Yaoyan and Sun, Baocheng and Wang, Yan and Qin, Hongwei &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2021/papers/He_Checkerboard_Context_Model_for_Efficient_Learned_Image_Compression_CVPR_2021_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; &lt;a class="link" href="https://github.com/leelitian/Checkerboard-Context-Model-Pytorch" target="_blank" rel="noopener"
&gt;[code1]&lt;/a&gt; &lt;a class="link" href="https://github.com/JiangWeibeta/Checkerboard-Context-Model-for-Efficient-Learned-Image-Compression" target="_blank" rel="noopener"
&gt;[code]&lt;/a&gt; (E)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2021) &lt;strong&gt;Learning Scalable ℓ∞-Constrained Near-Lossless Image Compression via Joint Lossy Image and Residual Compression&lt;/strong&gt;, Bai, Yuanchao and Liu, Xianming and Zuo, Wangmeng and Wang, Yaowei and Ji, Xiangyang &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2021/papers/Bai_Learning_Scalable_lY-Constrained_Near-Lossless_Image_Compression_via_Joint_Lossy_Image_CVPR_2021_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (lossless)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2021) &lt;strong&gt;End-to-end optimized image compression with competition of prior distributions&lt;/strong&gt;, Brummer, Benoit and De Vleeschouwer, Christophe &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2021W/CLIC/papers/Brummer_End-to-End_Optimized_Image_Compression_With_Competition_of_Prior_Distributions_CVPRW_2021_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; &lt;a class="link" href="https://github.com/trougnouf/Manypriors" target="_blank" rel="noopener"
&gt;[code]&lt;/a&gt;(E)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2021) &lt;strong&gt;Subjective Quality Optimized Efficient Image Compression&lt;/strong&gt;, Wang, Xining and Chen, Tong and Ma, Zhan &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2021W/CLIC/papers/Wang_Subjective_Quality_Optimized_Efficient_Image_Compression_CVPRW_2021_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; &lt;a class="link" href="https://github.com/mmSir/GainedVAE" target="_blank" rel="noopener"
&gt;[code]&lt;/a&gt;(perceptual)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2021) &lt;strong&gt;Variable Rate ROI Image Compression Optimized for Visual Quality&lt;/strong&gt;, Ma, Yi and Zhai, Yongqi and Yang, Chunhui and Yang, Jiayu and Wang, Ruofan and Zhou, Jing and Li, Kai and Chen, Ying and Wang, Ronggang &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2021W/CLIC/papers/Ma_Variable_Rate_ROI_Image_Compression_Optimized_for_Visual_Quality_CVPRW_2021_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;(VR)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2021) &lt;strong&gt;Image Compression with Recurrent Neural Network and Generalized Divisive Normalization&lt;/strong&gt;, Islam, Khawar and Dang, L Minh and Lee, Sujin and Moon, Hyeonjoon &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2021W/CLIC/papers/Islam_Image_Compression_With_Recurrent_Neural_Network_and_Generalized_Divisive_Normalization_CVPRW_2021_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;a class="link" href="https://github.com/khawar-islam/cvpr" target="_blank" rel="noopener"
&gt;[code]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2021) &lt;strong&gt;End-to-End Learned Image Compression with Augmented Normalizing Flows&lt;/strong&gt;, Ho, Yung-Han and Chan, Chih-Chun and Peng, Wen-Hsiao and Hang, Hsueh-Ming &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2021W/CLIC/papers/Ho_End-to-End_Learned_Image_Compression_With_Augmented_Normalizing_Flows_CVPRW_2021_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;a class="link" href="https://github.com/dororojames/anfic" target="_blank" rel="noopener"
&gt;[code]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2021) &lt;strong&gt;Learned Image Compression with Super-Resolution Residual Modules and DISTS Optimization&lt;/strong&gt;, Suzuki, Akifumi and Akutsu, Hiroaki and Naruko, Takahiro and Tsubota, Koki and Aizawa, Kiyoharu &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2021W/CLIC/papers/Suzuki_Learned_Image_Compression_With_Super-Resolution_Residual_Modules_and_DISTS_Optimization_CVPRW_2021_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (Perceptual)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPRW 2021) &lt;strong&gt;Perceptual Friendly Variable Rate Image Compression&lt;/strong&gt;, Gao, Yixin and Wu, Yaojun and Guo, Zongyu and Zhang, Zhizheng and Chen, Zhibo &lt;a class="link" href="https://openaccess.thecvf.com/content/CVPR2021W/CLIC/papers/Gao_Perceptual_Friendly_Variable_Rate_Image_Compression_CVPRW_2021_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (VR+Perceptual)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(WACV 2021) &lt;strong&gt;Saliency Driven Perceptual Image Compression&lt;/strong&gt;, Patel, Yash and Appalaraju, Srikar and Manmatha, R &lt;a class="link" href="https://openaccess.thecvf.com/content/WACV2021/papers/Patel_Saliency_Driven_Perceptual_Image_Compression_WACV_2021_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (Perceptual)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2021) &lt;strong&gt;Causal contextual prediction for learned image compression&lt;/strong&gt;, Guo, Zongyu and Zhang, Zhizheng and Feng, Runsen and Chen, Zhibo &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9455349" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (E)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TCSVT 2021) &lt;strong&gt;Learned Block-based Hybrid Image Compression&lt;/strong&gt;, Wu, Yaojun and Li, Xin and Zhang, Zhizheng and Jin, Xin and Chen, Zhibo &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9455349" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (T+E)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ACMMM 2021) &lt;strong&gt;Enhanced Invertible Encoding for Learned Image Compression&lt;/strong&gt;, Yueqi Xie, Ka Leong Cheng, Qifeng Chen &lt;a class="link" href="https://dl.acm.org/doi/pdf/10.1145/3474085.3475213" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; &lt;a class="link" href="https://github.com/xyq7/InvCompress" target="_blank" rel="noopener"
&gt;[code]&lt;/a&gt; (Invertible)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ACMMM 2021) &lt;strong&gt;Semantic Scalable Image Compression with Cross-Layer Priors&lt;/strong&gt;, Tu, Hanyue and Li, Li and Zhou, Wengang and Li, Houqiang &lt;a class="link" href="https://dl.acm.org/doi/pdf/10.1145/3474085.3475533" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (Scalable)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ACMMM 2021) &lt;strong&gt;Interpolation Variable Rate Image Compression&lt;/strong&gt;, Sun, Zhenhong and Tan, Zhiyu and Sun, Xiuyu and Zhang, Fangyi and Qian, Yichen and Li, Dongyang and Li, Hao &lt;a class="link" href="https://dl.acm.org/doi/pdf/10.1145/3474085.3475698" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (VR)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(TMM 2021) &lt;strong&gt;Learned Multi-Resolution Variable-Rate Image Compression With Octave-Based Residual Blocks&lt;/strong&gt;, Akbari, Mohammad and Liang, Jie and Han, Jingning and Tu, Chengjie &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;amp;arnumber=9385968&amp;amp;tag=1" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (VR)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(DCC 2021) &lt;strong&gt;Accelerate Neural Image Compression with Channel-adaptive Arithmetic Coding&lt;/strong&gt;, Guo, Zongyu and Fu, Jun and Feng, Runsen and Chen, Zhibo &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;amp;arnumber=9401277" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICIP 2021) &lt;strong&gt;Graph-Convolution Network for Image Compression&lt;/strong&gt;, Yang, Chunhui and Ma, Yi and Yang, Jiayu and Liu, Shiyi and Wang, Ronggang &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9506704" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(ICML 2021) &lt;strong&gt;Soft then hard: Rethinking the quantization in neural image compression&lt;/strong&gt;, Guo, Zongyu and Zhang, Zhizheng and Feng, Runsen and Chen, Zhibo &lt;a class="link" href="http://proceedings.mlr.press/v139/guo21c/guo21c.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2021) &lt;strong&gt;Learned Image Compression for Machine Perception&lt;/strong&gt;, Codevilla, Felipe and Simard, Jean Gabriel and Goroshin, Ross and Pal, Chris &lt;a class="link" href="https://arxiv.org/pdf/2111.02249.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (Perceptual)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2021) &lt;strong&gt;Substitutional Neural Image Compression&lt;/strong&gt;, Wang, Xiao and Jiang, Wei and Wang, Wei and Liu, Shan and Kulis, Brian and Chin, Peter &lt;a class="link" href="https://arxiv.org/pdf/2105.07512.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (VR)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2021) &lt;strong&gt;DPICT: Deep Progressive Image Compression Using Trit-Planes&lt;/strong&gt;, Lee, Jae-Han and Jeon, Seungmin and Choi, Kwang Pyo and Park, Youngo and Kim, Chang-Su &lt;a class="link" href="https://arxiv.org/pdf/2112.06334.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (VR)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2021) &lt;strong&gt;Implicit Neural Representations for Image Compression&lt;/strong&gt;, Strümpler, Yannick and Postels, Janis and Yang, Ren and Van Gool, Luc and Tombari, Federico &lt;a class="link" href="https://arxiv.org/pdf/2112.04267.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2021) &lt;strong&gt;A Novel Framework for Image-to-image Translation and Image Compression&lt;/strong&gt;, Yang, Fei and Wang, Yaxing and Herranz, Luis and Cheng, Yongmei and Mozerov, Mikhail &lt;a class="link" href="https://arxiv.org/pdf/2111.13105.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (I2I)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2021) &lt;strong&gt;Semantic-assisted image compression&lt;/strong&gt;, Sun, Qizheng and Guo, Caili and Yang, Yang and Chen, Jiujiu and Xue, Xijun &lt;a class="link" href="https://arxiv.org/pdf/2201.12599.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2021) &lt;strong&gt;End-to-End Learned Image Compression with Quantized Weights and Activations&lt;/strong&gt;, Sun, Heming and Yu, Lu and Katto, Jiro &lt;a class="link" href="https://arxiv.org/pdf/2111.09348.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2021) &lt;strong&gt;End-to-End Image Compression with Probabilistic Decoding&lt;/strong&gt;, Ma, Haichuan and Liu, Dong and Dong, Cunhui and Li, Li and Wu, Feng &lt;a class="link" href="https://arxiv.org/pdf/2109.14837.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2021) &lt;strong&gt;Towards End-to-End Image Compression and Analysis with Transformers&lt;/strong&gt;, Bai, Yuanchao and Yang, Xu and Liu, Xianming and Jiang, Junjun and Wang, Yaowei and Ji, Xiangyang and Gao, Wen &lt;a class="link" href="https://arxiv.org/pdf/2112.09300.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2021) &lt;strong&gt;A Cross Channel Context Model for Latents in Deep Image Compression&lt;/strong&gt;, Ma, Changyue and Wang, Zhao and Liao, Ruling and Ye, Yan &lt;a class="link" href="https://arxiv.org/pdf/2103.02884.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2021) &lt;strong&gt;Online Meta Adaptation for Variable-Rate Learned Image Compression&lt;/strong&gt;, Wei Jiang, Wei Wang, Songnan Li, Shan Liu &lt;a class="link" href="https://arxiv.org/abs/2111.08256" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt; (VR)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(arXiv preprint 2021) &lt;strong&gt;Transformer-based Image Compression&lt;/strong&gt;, Ming Lu, Peiyao Guo, Huiqing Shi, Chuntong Cao, Zhan Ma [&lt;a class="link" href="https://arxiv.org/abs/2111.06707" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="2020"&gt;✔2020
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;(arXiv preprint 2020) &lt;strong&gt;Lossless Image Compression through Super-Resolution&lt;/strong&gt;, Sheng Cao, Chao-Yuan Wu, Philipp Krähenbühl [&lt;a class="link" href="https://arxiv.org/abs/2004.02872" target="_blank" rel="noopener"
&gt;paper&lt;/a&gt;]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="2019"&gt;✔2019
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;(PCS 2019) &lt;strong&gt;A novel deep progressive image compression framework&lt;/strong&gt;, Cai, Chunlei and Chen, Li and Zhang, Xiaoyun and Lu, Guo and Gao, Zhiyong. &lt;a class="link" href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8954500" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;(CVPR 2019) &lt;strong&gt;Learning image and video compression through spatial-temporal energy compaction&lt;/strong&gt;, Cheng, Zhengxue and Sun, Heming and Takeuchi, Masaru and Katto, Jiro. &lt;a class="link" href="https://openaccess.thecvf.com/content_CVPR_2019/papers/Cheng_Learning_Image_and_Video_Compression_Through_Spatial-Temporal_Energy_Compaction_CVPR_2019_paper.pdf" target="_blank" rel="noopener"
&gt;[paper]&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="2018"&gt;✔2018
&lt;/h2&gt;&lt;hr&gt;</description></item><item><title>Awesome LLM Apps</title><link>https://hanguangwu.github.io/blog/en/p/awesome-llm-apps/</link><pubDate>Tue, 10 Feb 2026 18:34:25 -0800</pubDate><guid>https://hanguangwu.github.io/blog/en/p/awesome-llm-apps/</guid><description>&lt;h1 id="-awesome-llm-apps"&gt;🌟 Awesome LLM Apps
&lt;/h1&gt;&lt;h2 id="introduction"&gt;Introduction
&lt;/h2&gt;&lt;p&gt;A curated collection of &lt;strong&gt;Awesome LLM apps built with RAG, AI Agents, Multi-agent Teams, MCP, Voice Agents, and more.&lt;/strong&gt; This repository features LLM apps that use models from &lt;img src="https://cdn.simpleicons.org/openai" alt="openai logo" width="25" height="15"&gt;&lt;strong&gt;OpenAI&lt;/strong&gt; , &lt;img src="https://cdn.simpleicons.org/anthropic" alt="anthropic logo" width="25" height="15"&gt;&lt;strong&gt;Anthropic&lt;/strong&gt;, &lt;img src="https://cdn.simpleicons.org/googlegemini" alt="google logo" width="25" height="18"&gt;&lt;strong&gt;Google&lt;/strong&gt;, &lt;img src="https://cdn.simpleicons.org/x" alt="X logo" width="25" height="15"&gt;&lt;strong&gt;xAI&lt;/strong&gt; and open-source models like &lt;img src="https://cdn.simpleicons.org/alibabacloud" alt="alibaba logo" width="25" height="15"&gt;&lt;strong&gt;Qwen&lt;/strong&gt; or &lt;img src="https://cdn.simpleicons.org/meta" alt="meta logo" width="25" height="15"&gt;&lt;strong&gt;Llama&lt;/strong&gt; that you can run locally on your computer.&lt;/p&gt;
&lt;p&gt;&lt;a class="link" href="https://github.com/Shubhamsaboo/awesome-llm-apps" target="_blank" rel="noopener"
&gt;GitHub-Awesome LLM Apps&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a class="link" href="https://www.theunwindai.com/" target="_blank" rel="noopener"
&gt;Collection of awesome LLM apps with AI Agents and RAG using OpenAI, Anthropic, Gemini and opensource models.&lt;/a&gt;&lt;/p&gt;
&lt;h2 id="-why-awesome-llm-apps"&gt;🤔 Why Awesome LLM Apps?
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;💡 Discover practical and creative ways LLMs can be applied across different domains, from code repositories to email inboxes and more.&lt;/li&gt;
&lt;li&gt;🔥 Explore apps that combine LLMs from OpenAI, Anthropic, Gemini, and open-source alternatives with AI Agents, Agent Teams, MCP &amp;amp; RAG.&lt;/li&gt;
&lt;li&gt;🎓 Learn from well-documented projects and contribute to the growing open-source ecosystem of LLM-powered applications.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="-featured-ai-projects"&gt;📂 Featured AI Projects
&lt;/h2&gt;&lt;h3 id="ai-agents"&gt;AI Agents
&lt;/h3&gt;&lt;h3 id="-starter-ai-agents"&gt;🌱 Starter AI Agents
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="starter_ai_agents/ai_blog_to_podcast_agent/" &gt;🎙️ AI Blog to Podcast Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="starter_ai_agents/ai_breakup_recovery_agent/" &gt;❤️‍🩹 AI Breakup Recovery Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="starter_ai_agents/ai_data_analysis_agent/" &gt;📊 AI Data Analysis Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="starter_ai_agents/ai_medical_imaging_agent/" &gt;🩻 AI Medical Imaging Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="starter_ai_agents/ai_meme_generator_agent_browseruse/" &gt;😂 AI Meme Generator Agent (Browser)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="starter_ai_agents/ai_music_generator_agent/" &gt;🎵 AI Music Generator Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="starter_ai_agents/ai_travel_agent/" &gt;🛫 AI Travel Agent (Local &amp;amp; Cloud)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="starter_ai_agents/gemini_multimodal_agent_demo/" &gt;✨ Gemini Multimodal Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="starter_ai_agents/mixture_of_agents/" &gt;🔄 Mixture of Agents&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="starter_ai_agents/xai_finance_agent/" &gt;📊 xAI Finance Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="starter_ai_agents/opeani_research_agent/" &gt;🔍 OpenAI Research Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="starter_ai_agents/web_scrapping_ai_agent/" &gt;🕸️ Web Scraping AI Agent (Local &amp;amp; Cloud SDK)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="-advanced-ai-agents"&gt;🚀 Advanced AI Agents
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/ai_home_renovation_agent" &gt;🏚️ 🍌 AI Home Renovation Agent with Nano Banana Pro&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/single_agent_apps/ai_deep_research_agent/" &gt;🔍 AI Deep Research Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/ai_vc_due_diligence_agent_team" &gt;📊 AI VC Due Diligence Agent Team&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/single_agent_apps/research_agent_gemini_interaction_api" &gt;🔬 AI Research Planner &amp;amp; Executor (Google Interactions API)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/single_agent_apps/ai_consultant_agent" &gt;🤝 AI Consultant Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/single_agent_apps/ai_system_architect_r1/" &gt;🏗️ AI System Architect Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/ai_financial_coach_agent/" &gt;💰 AI Financial Coach Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/single_agent_apps/ai_movie_production_agent/" &gt;🎬 AI Movie Production Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/single_agent_apps/ai_investment_agent/" &gt;📈 AI Investment Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/single_agent_apps/ai_health_fitness_agent/" &gt;🏋️‍♂️ AI Health &amp;amp; Fitness Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/product_launch_intelligence_agent" &gt;🚀 AI Product Launch Intelligence Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/single_agent_apps/ai_journalist_agent/" &gt;🗞️ AI Journalist Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/ai_mental_wellbeing_agent/" &gt;🧠 AI Mental Wellbeing Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/single_agent_apps/ai_meeting_agent/" &gt;📑 AI Meeting Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/ai_Self-Evolving_agent/" &gt;🧬 AI Self-Evolving Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/ai_sales_intelligence_agent_team" &gt;👨🏻‍💼 AI Sales Intelligence Agent Team&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/ai_news_and_podcast_agents/" &gt;🎧 AI Social Media News and Podcast Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/accomplish-ai/openwork" target="_blank" rel="noopener"
&gt;🌐 Openwork - Open Browser Automation Agent&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="-autonomous-game-playing-agents"&gt;🎮 Autonomous Game Playing Agents
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/autonomous_game_playing_agent_apps/ai_3dpygame_r1/" &gt;🎮 AI 3D Pygame Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/autonomous_game_playing_agent_apps/ai_chess_agent/" &gt;♜ AI Chess Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/autonomous_game_playing_agent_apps/ai_tic_tac_toe_agent/" &gt;🎲 AI Tic-Tac-Toe Agent&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="-multi-agent-teams"&gt;🤝 Multi-agent Teams
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/ai_competitor_intelligence_agent_team/" &gt;🧲 AI Competitor Intelligence Agent Team&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/ai_finance_agent_team/" &gt;💲 AI Finance Agent Team&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/ai_game_design_agent_team/" &gt;🎨 AI Game Design Agent Team&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/ai_legal_agent_team/" &gt;👨‍⚖️ AI Legal Agent Team (Cloud &amp;amp; Local)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/ai_recruitment_agent_team/" &gt;💼 AI Recruitment Agent Team&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/ai_real_estate_agent_team" &gt;🏠 AI Real Estate Agent Team&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/ai_services_agency/" &gt;👨‍💼 AI Services Agency (CrewAI)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/ai_teaching_agent_team/" &gt;👨‍🏫 AI Teaching Agent Team&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/multimodal_coding_agent_team/" &gt;💻 Multimodal Coding Agent Team&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/multimodal_design_agent_team/" &gt;✨ Multimodal Design Agent Team&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_ai_agents/multi_agent_apps/agent_teams/multimodal_uiux_feedback_agent_team/" &gt;🎨 🍌 Multimodal UI/UX Feedback Agent Team with Nano Banana&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://hanguangwu.github.io/blog/advanced_ai_agents/multi_agent_apps/agent_teams/ai_travel_planner_agent_team/" &gt;🌏 AI Travel Planner Agent Team&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="-voice-ai-agents"&gt;🗣️ Voice AI Agents
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="voice_ai_agents/ai_audio_tour_agent/" &gt;🗣️ AI Audio Tour Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="voice_ai_agents/customer_support_voice_agent/" &gt;📞 Customer Support Voice Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="voice_ai_agents/voice_rag_openaisdk/" &gt;🔊 Voice RAG Agent (OpenAI SDK)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/akshayaggarwal99/jarvis-ai-assistant" target="_blank" rel="noopener"
&gt;🎙️ OpenSource Voice Dictation Agent (like Wispr Flow)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="mcp-ai-agents"&gt;&lt;img src="https://cdn.simpleicons.org/modelcontextprotocol" alt="mcp logo" width="25" height="20"&gt; MCP AI Agents
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="mcp_ai_agents/browser_mcp_agent/" &gt;♾️ Browser MCP Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="mcp_ai_agents/github_mcp_agent/" &gt;🐙 GitHub MCP Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="mcp_ai_agents/notion_mcp_agent" &gt;📑 Notion MCP Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="mcp_ai_agents/ai_travel_planner_mcp_agent_team" &gt;🌍 AI Travel Planner MCP Agent&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="-rag-retrieval-augmented-generation"&gt;📀 RAG (Retrieval Augmented Generation)
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/agentic_rag_embedding_gemma" &gt;🔥 Agentic RAG with Embedding Gemma&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/agentic_rag_with_reasoning/" &gt;🧐 Agentic RAG with Reasoning&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/ai_blog_search/" &gt;📰 AI Blog Search (RAG)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/autonomous_rag/" &gt;🔍 Autonomous RAG&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/contextualai_rag_agent/" &gt;🔄 Contextual AI RAG Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/corrective_rag/" &gt;🔄 Corrective RAG (CRAG)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/deepseek_local_rag_agent/" &gt;🐋 Deepseek Local RAG Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/gemini_agentic_rag/" &gt;🤔 Gemini Agentic RAG&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/hybrid_search_rag/" &gt;👀 Hybrid Search RAG (Cloud)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/llama3.1_local_rag/" &gt;🔄 Llama 3.1 Local RAG&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/local_hybrid_search_rag/" &gt;🖥️ Local Hybrid Search RAG&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/local_rag_agent/" &gt;🦙 Local RAG Agent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/rag-as-a-service/" &gt;🧩 RAG-as-a-Service&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/rag_agent_cohere/" &gt;✨ RAG Agent with Cohere&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/rag_chain/" &gt;⛓️ Basic RAG Chain&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/rag_database_routing/" &gt;📠 RAG with Database Routing&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="rag_tutorials/vision_rag/" &gt;🖼️ Vision RAG&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="-llm-apps-with-memory-tutorials"&gt;💾 LLM Apps with Memory Tutorials
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/llm_apps_with_memory_tutorials/ai_arxiv_agent_memory/" &gt;💾 AI ArXiv Agent with Memory&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/llm_apps_with_memory_tutorials/ai_travel_agent_memory/" &gt;🛩️ AI Travel Agent with Memory&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/llm_apps_with_memory_tutorials/llama3_stateful_chat/" &gt;💬 Llama3 Stateful Chat&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/llm_apps_with_memory_tutorials/llm_app_personalized_memory/" &gt;📝 LLM App with Personalized Memory&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/llm_apps_with_memory_tutorials/local_chatgpt_with_memory/" &gt;🗄️ Local ChatGPT Clone with Memory&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/llm_apps_with_memory_tutorials/multi_llm_memory/" &gt;🧠 Multi-LLM Application with Shared Memory&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="-chat-with-x-tutorials"&gt;💬 Chat with X Tutorials
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/chat_with_X_tutorials/chat_with_github/" &gt;💬 Chat with GitHub (GPT &amp;amp; Llama3)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/chat_with_X_tutorials/chat_with_gmail/" &gt;📨 Chat with Gmail&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/chat_with_X_tutorials/chat_with_pdf/" &gt;📄 Chat with PDF (GPT &amp;amp; Llama3)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/chat_with_X_tutorials/chat_with_research_papers/" &gt;📚 Chat with Research Papers (ArXiv) (GPT &amp;amp; Llama3)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/chat_with_X_tutorials/chat_with_substack/" &gt;📝 Chat with Substack&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/chat_with_X_tutorials/chat_with_youtube_videos/" &gt;📽️ Chat with YouTube Videos&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="-llm-optimization-tools"&gt;🎯 LLM Optimization Tools
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/llm_optimization_tools/toonify_token_optimization/" &gt;🎯 Toonify Token Optimization&lt;/a&gt; - Reduce LLM API costs by 30-60% using TOON format&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="advanced_llm_apps/llm_optimization_tools/headroom_context_optimization/" &gt;🧠 Headroom Context Optimization&lt;/a&gt; - Reduce LLM API costs by 50-90% through intelligent context compression for AI agents (includes persistent memory &amp;amp; MCP support)&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="-llm-fine-tuning-tutorials"&gt;🔧 LLM Fine-tuning Tutorials
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://cdn.simpleicons.org/google" alt="google logo" width="20" height="15"&gt; &lt;a class="link" href="advanced_llm_apps/llm_finetuning_tutorials/gemma3_finetuning/" &gt;Gemma 3 Fine-tuning&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;img src="https://cdn.simpleicons.org/meta" alt="meta logo" width="25" height="15"&gt; &lt;a class="link" href="advanced_llm_apps/llm_finetuning_tutorials/llama3.2_finetuning/" &gt;Llama 3.2 Fine-tuning&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="-ai-agent-framework-crash-course"&gt;🧑‍🏫 AI Agent Framework Crash Course
&lt;/h3&gt;&lt;p&gt;&lt;img src="https://cdn.simpleicons.org/google" alt="google logo" width="25" height="15"&gt; &lt;a class="link" href="ai_agent_framework_crash_course/google_adk_crash_course/" &gt;Google ADK Crash Course&lt;/a&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Starter agent; model‑agnostic (OpenAI, Claude)&lt;/li&gt;
&lt;li&gt;Structured outputs (Pydantic)&lt;/li&gt;
&lt;li&gt;Tools: built‑in, function, third‑party, MCP tools&lt;/li&gt;
&lt;li&gt;Memory; callbacks; Plugins&lt;/li&gt;
&lt;li&gt;Simple multi‑agent; Multi‑agent patterns&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;img src="https://cdn.simpleicons.org/openai" alt="openai logo" width="25" height="15"&gt; &lt;a class="link" href="ai_agent_framework_crash_course/openai_sdk_crash_course/" &gt;OpenAI Agents SDK Crash Course&lt;/a&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Starter agent; function calling; structured outputs&lt;/li&gt;
&lt;li&gt;Tools: built‑in, function, third‑party integrations&lt;/li&gt;
&lt;li&gt;Memory; callbacks; evaluation&lt;/li&gt;
&lt;li&gt;Multi‑agent patterns; agent handoffs&lt;/li&gt;
&lt;li&gt;Swarm orchestration; routing logic&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="-getting-started"&gt;🚀 Getting Started
&lt;/h2&gt;&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Clone the repository&lt;/strong&gt;&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;div class="chroma"&gt;
&lt;table class="lntable"&gt;&lt;tr&gt;&lt;td class="lntd"&gt;
&lt;pre tabindex="0" class="chroma"&gt;&lt;code&gt;&lt;span class="lnt"&gt;1
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class="lntd"&gt;
&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bash" data-lang="bash"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;git clone https://github.com/Shubhamsaboo/awesome-llm-apps.git
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Navigate to the desired project directory&lt;/strong&gt;&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;div class="chroma"&gt;
&lt;table class="lntable"&gt;&lt;tr&gt;&lt;td class="lntd"&gt;
&lt;pre tabindex="0" class="chroma"&gt;&lt;code&gt;&lt;span class="lnt"&gt;1
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class="lntd"&gt;
&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bash" data-lang="bash"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nb"&gt;cd&lt;/span&gt; awesome-llm-apps/starter_ai_agents/ai_travel_agent
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Install the required dependencies&lt;/strong&gt;&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;div class="chroma"&gt;
&lt;table class="lntable"&gt;&lt;tr&gt;&lt;td class="lntd"&gt;
&lt;pre tabindex="0" class="chroma"&gt;&lt;code&gt;&lt;span class="lnt"&gt;1
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class="lntd"&gt;
&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bash" data-lang="bash"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;pip install -r requirements.txt
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Follow the project-specific instructions&lt;/strong&gt; in each project&amp;rsquo;s &lt;code&gt;README.md&lt;/code&gt; file to set up and run the app.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;</description></item><item><title>NeuroTechEDU's Awesome List of BCI-related Resources</title><link>https://hanguangwu.github.io/blog/en/p/neurotechedus-awesome-list-of-bci-related-resources/</link><pubDate>Mon, 02 Feb 2026 12:34:25 -0800</pubDate><guid>https://hanguangwu.github.io/blog/en/p/neurotechedus-awesome-list-of-bci-related-resources/</guid><description>&lt;h1 id="neurotechedus-awesome-list-of-bci-related-resources"&gt;NeuroTechEDU&amp;rsquo;s Awesome List of BCI-related Resources
&lt;/h1&gt;&lt;h2 id="introduction"&gt;Introduction
&lt;/h2&gt;&lt;p&gt;&lt;a class="link" href="https://github.com/NeuroTechX/awesome-bci" target="_blank" rel="noopener"
&gt;Curated Collection of BCI resources&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;This is a list of tools, resources, and learning materials related to Brain-Computer Interfaces (BCI). The list is maintained by the &lt;a class="link" href="https://neurotechx.com/" target="_blank" rel="noopener"
&gt;NeuroTechX&lt;/a&gt; community.&lt;/p&gt;
&lt;h2 id="software"&gt;Software
&lt;/h2&gt;&lt;h3 id="bci-experiment-design-and-analysis"&gt;BCI Experiment Design and Analysis
&lt;/h3&gt;&lt;p&gt;These applications help you design BCI experiments, run them, collect data, and analyze the results.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/NeuroTechX/eeg-expy" target="_blank" rel="noopener"
&gt;EEG-ExPy&lt;/a&gt;: Free &amp;amp; Open-Source (FOSS) Python library for EEG &amp;amp; experiment design, recording, and analysis. Maintained by the EEG-ExPy team within NeuroTechX. &lt;a class="link" href="https://bit.ly/m/eeg-expy-cns" target="_blank" rel="noopener"
&gt;CNS2024 Poster&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://openvibe.inria.fr/" target="_blank" rel="noopener"
&gt;OpenViBE&lt;/a&gt;: A software platform dedicated to designing, testing, and using Brain-Computer Interfaces, maintained by the OpenViBE Consortium.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.bci2000.org/mediawiki/index.php/Main_Page" target="_blank" rel="noopener"
&gt;BCI2000&lt;/a&gt;: Software suite with GUI based on C++ for data acquisition, stimulus presentation, and brain monitoring applications.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://neuroimage.usc.edu/brainstorm/" target="_blank" rel="noopener"
&gt;Brainstorm&lt;/a&gt;: Collaborative, open-source application dedicated to the analysis of brain recordings: MEG, EEG, fNIRS, ECoG, depth electrodes and multiunit electrophysiology.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.shifz.org/brainbay/" target="_blank" rel="noopener"
&gt;BrainBay&lt;/a&gt;: Bio- and neurofeedback application working with various hardware frameworks including OpenBCI/OpenEEG.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://okazolab.com/" target="_blank" rel="noopener"
&gt;EventIDE&lt;/a&gt;: A software platform for designing and running multimodal experiments, with an IDE.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://neuropype.io" target="_blank" rel="noopener"
&gt;NeuroPype&lt;/a&gt;: Platform for real-time brain-computer interfacing (BCI), neuroimaging, and neural signal processing; supports a range of biosignal modalities including EEG, fNIRS, ExG, etc.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://mne.tools/stable/install/mne_tools_suite.html" target="_blank" rel="noopener"
&gt;MNE&lt;/a&gt;: MNE-Python is an open-source Python module for processing, analysis, and visualization of functional neuroimaging data (EEG, MEG, sEEG, ECoG, and fNIRS). The tools suite includes interoperable packages in Python, MATLAB, C++, etc., which operate in GUI, CLI, or API.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.psychopy.org/builder/" target="_blank" rel="noopener"
&gt;PsychoPy Builder&lt;/a&gt;: PsychoPy is an open-source application for creating experiments in neuroscience, psychology, and psychophysics.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://psychtoolbox.org/" target="_blank" rel="noopener"
&gt;PsychToolBox&lt;/a&gt;: Psychophysics Toolbox Version 3 (PTB-3) is a free set of Matlab and GNU Octave functions for vision and neuroscience research.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="matlab-toolboxes"&gt;Matlab Toolboxes
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://sccn.ucsd.edu/eeglab/" target="_blank" rel="noopener"
&gt;EEGLab&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.fieldtriptoolbox.org/" target="_blank" rel="noopener"
&gt;FieldTrip&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://sccn.ucsd.edu/wiki/BCILAB" target="_blank" rel="noopener"
&gt;BCILab&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/bbci/bbci_public" target="_blank" rel="noopener"
&gt;BBCI&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://erpinfo.org/erplab" target="_blank" rel="noopener"
&gt;ERPLAB&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://psychtoolbox.org" target="_blank" rel="noopener"
&gt;Psychtoolbox&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://chronux.org/" target="_blank" rel="noopener"
&gt;Chronux&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="python-toolboxes"&gt;Python Toolboxes
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/thunder-project/thunder" target="_blank" rel="noopener"
&gt;Thunder&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/bbci/pyff" target="_blank" rel="noopener"
&gt;Pyff&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/bbci/mushu" target="_blank" rel="noopener"
&gt;Mushu&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/bbci/wyrm" target="_blank" rel="noopener"
&gt;Wyrm&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/curiositry/EEGrunt" target="_blank" rel="noopener"
&gt;EEGrunt&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://getcloudbrain.com/" target="_blank" rel="noopener"
&gt;Cloudbrain&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/mne-tools/mne-python" target="_blank" rel="noopener"
&gt;MNE-Python&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/strfry/OpenNFB" target="_blank" rel="noopener"
&gt;OpenNFB&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/octopicorn/bcikit" target="_blank" rel="noopener"
&gt;bcikit&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.psychopy.org/" target="_blank" rel="noopener"
&gt;PsychoPy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/PIA-Group/BioSPPy" target="_blank" rel="noopener"
&gt;BioSPPy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://timeflux.io" target="_blank" rel="noopener"
&gt;Timeflux&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/eegsynth/eegsynth" target="_blank" rel="noopener"
&gt;EEGsynth&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/pyRiemann/pyRiemann" target="_blank" rel="noopener"
&gt;pyRiemann&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/NeuroTechX/moabb" target="_blank" rel="noopener"
&gt;MOABB&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/nmc-costa/neuroprime" target="_blank" rel="noopener"
&gt;NeuroPrime&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://braindecode.org/dev/index.html" target="_blank" rel="noopener"
&gt;Braindecode&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://brainflow.org" target="_blank" rel="noopener"
&gt;Brainflow&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/neurotechx/eeg-expy" target="_blank" rel="noopener"
&gt;EEG-ExPy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/LMBooth/pybci" target="_blank" rel="noopener"
&gt;PyBCI&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/arctop/mw75-streamer" target="_blank" rel="noopener"
&gt;mw75-streamer&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="mobile-apps"&gt;Mobile Apps
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;MindMonitor: &lt;a class="link" href="https://apps.apple.com/ca/app/mind-monitor/id988527143" target="_blank" rel="noopener"
&gt;iOS App Store&lt;/a&gt;, &lt;a class="link" href="https://play.google.com/store/apps/details?id=com.sonicPenguins.museMonitor" target="_blank" rel="noopener"
&gt;Google Play Store&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;NeuroSky Android SDK: &lt;a class="link" href="https://github.com/pwittchen/neurosky-android-sdk" target="_blank" rel="noopener"
&gt;GitHub&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;EEG-101 (now deprecated): &lt;a class="link" href="https://github.com/NeuroTechX/eeg-101" target="_blank" rel="noopener"
&gt;GitHub&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="brain-visualizations"&gt;Brain Visualizations
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://brainbox.pasteur.fr/" target="_blank" rel="noopener"
&gt;BrainBox&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://brainbrowser.cbrain.mcgill.ca/" target="_blank" rel="noopener"
&gt;BrainBrowser&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://zzz.bwh.harvard.edu/luna/apps/moonlight/" target="_blank" rel="noopener"
&gt;Moonlight&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="raspberrypi-framework"&gt;RaspberryPi Framework
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://pieeg.com/" target="_blank" rel="noopener"
&gt;PiEEG&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/BlinoTech/BlinoTech.github.io" target="_blank" rel="noopener"
&gt;Blino PiNaps&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/AtlantsEmbedded/IntelliPi" target="_blank" rel="noopener"
&gt;IntelliPi&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="communication-protocols"&gt;Communication Protocols
&lt;/h3&gt;&lt;p&gt;These are some commonly used communication protocols.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/sccn/labstreaminglayer" target="_blank" rel="noopener"
&gt;Lab Streaming Layer&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.opensoundcontrol.org/" target="_blank" rel="noopener"
&gt;Open Sound Control&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.fieldtriptoolbox.org/development/realtime/buffer_protocol/" target="_blank" rel="noopener"
&gt;FieldTrip buffer&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="hardware"&gt;Hardware
&lt;/h2&gt;&lt;p&gt;This section is organized by type of technology.&lt;/p&gt;
&lt;h3 id="eeg"&gt;EEG
&lt;/h3&gt;&lt;p&gt;&lt;a class="link" href="https://en.wikipedia.org/wiki/Electroencephalography" target="_blank" rel="noopener"
&gt;Electroencephalography&lt;/a&gt; is the most commonly used form of neurotechnology. Many options are available, so you can easily find a device that matches your needs and budget.&lt;/p&gt;
&lt;h4 id="consumer-and-diy-devices"&gt;Consumer and DIY Devices
&lt;/h4&gt;&lt;p&gt;Some of these devices are still supported and actively developed by manufacturers, community members, or researchers. Others are no longer supported but may still have a community of users who can help you get access.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://choosemuse.com/" target="_blank" rel="noopener"
&gt;Muse 2016, Muse 2, Muse S&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://openbci.com" target="_blank" rel="noopener"
&gt;OpenBCI Ganglion, Cyton, Daisy, Galea&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://iduntechnologies.com/idun-guardian" target="_blank" rel="noopener"
&gt;IDUN Guardian&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://neurable.com/" target="_blank" rel="noopener"
&gt;Neurable MW75 Neuro&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://neurosity.co/" target="_blank" rel="noopener"
&gt;Neurosity Crown&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://brainbit.com/" target="_blank" rel="noopener"
&gt;BrainBit Headband &amp;amp; Flex&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://emotiv.com" target="_blank" rel="noopener"
&gt;Emotiv EPOC, Flex, Insight&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://beacon.bio/dreem-headband/" target="_blank" rel="noopener"
&gt;Dreem by Beacon Biosignals&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.cognionics.com/" target="_blank" rel="noopener"
&gt;Cognionics&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://elemindtech.com/" target="_blank" rel="noopener"
&gt;Elemind&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://neurosky.com/" target="_blank" rel="noopener"
&gt;Neurosky&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.crowdsupply.com/neuroidss/freeeeg32" target="_blank" rel="noopener"
&gt;FreeEEG32: an open-source 32-channel EEG&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://bakerdh.wordpress.com/2013/01/31/a-first-look-at-the-olimex-eeg-smt/" target="_blank" rel="noopener"
&gt;EEG-SMT by Olimex&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.crowdsupply.com/starcat/hackeeg" target="_blank" rel="noopener"
&gt;HackEEG&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://icibici.github.io/site/" target="_blank" rel="noopener"
&gt;icibici&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://openeeg.sourceforge.net/doc/" target="_blank" rel="noopener"
&gt;OpenEEG&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="research-devices-manufactures"&gt;Research Devices Manufactures
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://www.wearablesensing.com" target="_blank" rel="noopener"
&gt;Wearable Sensing Dry Electrode EEG&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.gtec.at" target="_blank" rel="noopener"
&gt;g.tec&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.egi.com/" target="_blank" rel="noopener"
&gt;EGI High Density EEG&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.biosemi.com/" target="_blank" rel="noopener"
&gt;BioSemi&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.ant-neuro.com" target="_blank" rel="noopener"
&gt;ANT Neuro&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.advancedbrainmonitoring.com" target="_blank" rel="noopener"
&gt;Advanced Brain Monitoring&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.brainproducts.com/" target="_blank" rel="noopener"
&gt;Brain Products&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://mentalab.com/" target="_blank" rel="noopener"
&gt;Mentalab Explore&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.natus.com" target="_blank" rel="noopener"
&gt;Natus Neuro&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.tmsi.com/products/" target="_blank" rel="noopener"
&gt;TMSi&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="eeg-parts--supplies"&gt;EEG Parts &amp;amp; Supplies
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://conscious-labs.com/3-eeg-devices" target="_blank" rel="noopener"
&gt;Conscious Labs - EEG Supra Headphones&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.emotiv.com/products/flex-gel" target="_blank" rel="noopener"
&gt;Emotiv Flex Gel&lt;/a&gt; &amp;amp; &lt;a class="link" href="https://www.emotiv.com/products/flex-saline" target="_blank" rel="noopener"
&gt;Emotiv Flex Saline&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://fri-fl-shop.com/" target="_blank" rel="noopener"
&gt;Florida Research Instruments&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://oshpark.com/shared_projects/h2i1xBaW" target="_blank" rel="noopener"
&gt;DIY Electrode Design&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.ti.com/tool/ads1299eegfe-pdk" target="_blank" rel="noopener"
&gt;TI ADS1299EEG-FE&lt;/a&gt;: Analog front end for EEG solutions, used e.g. in the OpenBCI Cyton.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://intantech.com" target="_blank" rel="noopener"
&gt;Intan Technologies&lt;/a&gt;: Microchips and miniature recording &amp;amp; stimulation headstages.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://shop.openbci.com/products/idun-dryode-kit" target="_blank" rel="noopener"
&gt;IDUN Dryode&lt;/a&gt;: Adhesive dry electrodes for EEG.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://bio-medical.com/supplies/eeg-electrodes.html" target="_blank" rel="noopener"
&gt;Bio-Medical&lt;/a&gt;: For supplies and consumables&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.sciencedirect.com/science/article/pii/S1388245704003906" target="_blank" rel="noopener"
&gt;Comparison of different types of electrodes&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="nirs"&gt;NIRS
&lt;/h3&gt;&lt;p&gt;Near-Infrared Spectroscopy (NIRS) measures the concentration of hemoglobin in a brain region, which can be used to infer energy expenditure and hence neural activity in that region.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://www.artinis.com/" target="_blank" rel="noopener"
&gt;Artinis Medical Systems&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.cortivision.com/" target="_blank" rel="noopener"
&gt;CortiVision&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.hitachi-hightech.com/global/product_list/?ld=iis1&amp;amp;md=iis1-6" target="_blank" rel="noopener"
&gt;Hitachi Hightech&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://nirx.net/" target="_blank" rel="noopener"
&gt;NIRx&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.ssi.shimadzu.com/products/productgroup.cfm?subcatlink=tissueimaging" target="_blank" rel="noopener"
&gt;Shimadzu&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://kernel.co" target="_blank" rel="noopener"
&gt;Kernel Flow&lt;/a&gt;: EEG + TD-fNIRS&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="multimodal-neurotech"&gt;Multimodal Neurotech
&lt;/h3&gt;&lt;p&gt;These devices combine different types of sensors to measure or influence brain activity.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://foc.us/focus-eeg-dev-kit-instructions-guide/" target="_blank" rel="noopener"
&gt;Foc.us Dev Kit: EEG, tDCS, fNIRS, tACS&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.neuroelectrics.com/" target="_blank" rel="noopener"
&gt;Neuroelectrics: EEG, tDCS&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.bitalino.com/" target="_blank" rel="noopener"
&gt;BITalino: EEG, EMG, ECG, EDA&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.emotibit.com/" target="_blank" rel="noopener"
&gt;EmotiBit: EDA, PPG, temperature&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="brain-stimulation"&gt;Brain Stimulation
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.instructables.com/id/Transcranial-Magnetic-Stimulation-TMS-Device/" target="_blank" rel="noopener"
&gt;DIY TMS&lt;/a&gt;: Transcranial Magnetic Stimulation (TMS)&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.bostonscientific.com/en-US/products/deep-brain-stimulation-systems.html" target="_blank" rel="noopener"
&gt;Boston Scientific&lt;/a&gt;: DBS, SCS&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.medtronic.com/us-en/index.html" target="_blank" rel="noopener"
&gt;Medtronic&lt;/a&gt;: DBS, tES, SCS&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.magstim.com/" target="_blank" rel="noopener"
&gt;Magstim&lt;/a&gt;: TMS&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.soterixmedical.com/" target="_blank" rel="noopener"
&gt;Soterix Medical&lt;/a&gt;: TDCS, tACS, tRNS&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://clarity-technologies.com/" target="_blank" rel="noopener"
&gt;Clarity&lt;/a&gt;: Light &amp;amp; Stimulation therapy for Alzheimer&amp;rsquo;s Disease&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://vielight.com/" target="_blank" rel="noopener"
&gt;Vielight&lt;/a&gt;: Transcranial Photobiomodulation&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.neuroelectrics.com/" target="_blank" rel="noopener"
&gt;Neuroelectrics&lt;/a&gt;: tDCS, tACS, tRNS&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.neuropace.com/" target="_blank" rel="noopener"
&gt;NeuroPace&lt;/a&gt;: RNS&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://nervexneurotech.com/" target="_blank" rel="noopener"
&gt;NerveX&lt;/a&gt;: VNS in canine epilepsy.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://neurosigma.com/" target="_blank" rel="noopener"
&gt;NeuroSigma&lt;/a&gt;: eTNS&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.brainsway.com/" target="_blank" rel="noopener"
&gt;Brainsway&lt;/a&gt;: Deep TMS&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="upcoming-neuroimaging-tech"&gt;Upcoming NeuroImaging Tech
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://fultrasound.eu/" target="_blank" rel="noopener"
&gt;Functional Ultrasound (FUS)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://en.wikipedia.org/wiki/Event-related_optical_signal" target="_blank" rel="noopener"
&gt;Event-Related Optical Signal (wiki)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.researchgate.net/publication/223360817_Shedding_light_on_brain_function_The_event-related_optical_signal" target="_blank" rel="noopener"
&gt;Shedding light on brain function: the event-related optical signal (paper)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://arxiv.org/pdf/cond-mat/9906188.pdf" target="_blank" rel="noopener"
&gt;Quasi-Ballistic Photons (the technology behind Facebook&amp;rsquo;s BCI effort)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/OpenEIT/EIT_PCB" target="_blank" rel="noopener"
&gt;Open Electrical Impedance Tomography&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9805039/" target="_blank" rel="noopener"
&gt;Optically Pumped Magnetometers (OPM)&lt;/a&gt;, e.g., &lt;a class="link" href="https://quspin.com/" target="_blank" rel="noopener"
&gt;QuSpin&lt;/a&gt; and &lt;a class="link" href="https://www.cercamagnetics.com/cerca-opm-meg" target="_blank" rel="noopener"
&gt;Cerca&lt;/a&gt;:
&lt;ul&gt;
&lt;li&gt;Optical pumping stabilizes highly sensitive magnetometers that measure the changes in magnetic field produced by neural activity.&lt;/li&gt;
&lt;li&gt;Unlike conventional (SQUID) MEG, OPMs need no helium cooling, making them much smaller, lighter, and somewhat cheaper.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Diffuse Optical Imaging: Used, for instance, by Mary Lou Jepsen et al. at &lt;a class="link" href="https://www.openwater.health/" target="_blank" rel="noopener"
&gt;Openwater&lt;/a&gt;, who aim to build a portable MRI. More on the technology:
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://en.wikipedia.org/wiki/Diffuse_optical_imaging" target="_blank" rel="noopener"
&gt;Diffuse Optical Imaging pt. 1 (wiki)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://drive.google.com/file/d/0B-G2rraXdWRlenk2U0QzbW9PdkU/view?usp=sharing" target="_blank" rel="noopener"
&gt;Diffuse Optical Imaging pt. 2&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="brain-databases"&gt;Brain Databases
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://sccn.ucsd.edu/~arno/fam2data/publicly_available_EEG_data.html" target="_blank" rel="noopener"
&gt;SCCN list of EEG/ERP data for free public download&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.studycatalog.org/" target="_blank" rel="noopener"
&gt;EEG studies with the raw data&lt;/a&gt; - &lt;a class="link" href="http://www.bigeeg.org/" target="_blank" rel="noopener"
&gt;(from the BigEEG Consortium)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://bnci-horizon-2020.eu/database/data-sets" target="_blank" rel="noopener"
&gt;BNCI Horizon Data Sets&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://fcon_1000.projects.nitrc.org/indi/cmi_eeg/" target="_blank" rel="noopener"
&gt;The Child Mind Institute MIPDB Dataset&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://memory.psych.upenn.edu/RAM" target="_blank" rel="noopener"
&gt;RAM (DARPA) Invasive Recording Dataset from U. Penn&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://mindbigdata.com/opendb/index.html" target="_blank" rel="noopener"
&gt;MindBigData MNIST of Brain Digits&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.mindbigdata.com/opendb/imagenet.html" target="_blank" rel="noopener"
&gt;MindBigData ImageNet of The Brain&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/meagmohit/EEG-Datasets" target="_blank" rel="noopener"
&gt;meagmohit&amp;rsquo;s List of EEG Datasets&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://openneuro.org/" target="_blank" rel="noopener"
&gt;OpenNeuro&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://physionet.org/" target="_blank" rel="noopener"
&gt;PhysioNet&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://sleepdata.org/" target="_blank" rel="noopener"
&gt;National Sleep Research Resource&lt;/a&gt;: A large collection of sleep data. Supported by the Sleep Research Society (SRS).&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://isip.piconepress.com/projects/" target="_blank" rel="noopener"
&gt;Temple University EEG Corpora&lt;/a&gt;: various datasets covering healthy subjects, epilepsy, artifacts, and more.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="tutorials-and-project-ideas"&gt;Tutorials and Project Ideas
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://eegedu.com" target="_blank" rel="noopener"
&gt;EEGEdu&lt;/a&gt;: Web-based live tutorial on EEG and BCI, from basic to advanced. Maintained by the Mathewsons (&lt;a class="link" href="https://sites.psych.ualberta.ca/kylemathewson/" target="_blank" rel="noopener"
&gt;Ky&lt;/a&gt;, &lt;a class="link" href="https://korymathewson.com/" target="_blank" rel="noopener"
&gt;Kor&lt;/a&gt;, &lt;a class="link" href="https://www.linkedin.com/in/keyfer/" target="_blank" rel="noopener"
&gt;Key&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.frontiernerds.com/brain-hack" target="_blank" rel="noopener"
&gt;How to Hack Toy EEGs&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/bcimontreal/bci_workshop/blob/master/INSTRUCTIONS.md" target="_blank" rel="noopener"
&gt;BCI Workshop&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://sccn.ucsd.edu/wiki/Introduction_To_Modern_Brain-Computer_Interface_Design" target="_blank" rel="noopener"
&gt;Introduction to Modern BCI&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://eeghacker.blogspot.com/2015/03/brain-controlled-shark-attack.html" target="_blank" rel="noopener"
&gt;Brain-Controlled Shark Attack&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/neuralcubes/musephero" target="_blank" rel="noopener"
&gt;Controlling a Sphero with a Muse&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/jmanart/smartphone-bci" target="_blank" rel="noopener"
&gt;Building a 20 Euro EEG for your smartphone&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://web.archive.org/web/20240930191736/https://openvibe.inria.fr/forum/viewtopic.php?f=3&amp;amp;t=9668" target="_blank" rel="noopener"
&gt;Muse File Reader for OpenVibe&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/NeuroTechX/eeg-101" target="_blank" rel="noopener"
&gt;EEG 101: Interactive tutorial for Android and Muse&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/katie356/BrainwaveAnalyzer/tree/master/web-edition" target="_blank" rel="noopener"
&gt;Brainwave analyzer&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://naplab.ee.columbia.edu/bcilab.html" target="_blank" rel="noopener"
&gt;BCI Course offered by Columbia University&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/NeurotechBerkeley/bci-course" target="_blank" rel="noopener"
&gt;BCI Course at Berkeley by Pierre of NeuroTechX&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.humanbrainmapping.org/m/pages.cfm?pageID=3814" target="_blank" rel="noopener"
&gt;EEG and MRI Course offered by OHBM&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/inclusive-brains/prometheus-bci" target="_blank" rel="noopener"
&gt;Prometheus Multimodal BCI (Olympic Torch)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="communities-and-blogs"&gt;Communities and Blogs
&lt;/h2&gt;&lt;h3 id="forums"&gt;Forums
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://neurobb.com/" target="_blank" rel="noopener"
&gt;NeuroBB&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://openbci.com/community/" target="_blank" rel="noopener"
&gt;OpenBCI Community&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://forum.choosemuse.com/" target="_blank" rel="noopener"
&gt;Muse Community&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://support.neurosky.com/discussions" target="_blank" rel="noopener"
&gt;NeuroSky&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://forum.emotiv.com/" target="_blank" rel="noopener"
&gt;Emotiv&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="blogs"&gt;Blogs
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://neurotechx.medium.com/" target="_blank" rel="noopener"
&gt;NeuroTechX Content Lab&lt;/a&gt;: Articles, tutorials, and interviews on neurotechnology&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://eegnewsletter.substack.com/" target="_blank" rel="noopener"
&gt;The EEG Newsletter&lt;/a&gt;: News, events, and resources in EEG. By Raquel E. London&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://nschawor.github.io/posts/" target="_blank" rel="noopener"
&gt;Natalie Schaworonkow&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.autodidacts.io/" target="_blank" rel="noopener"
&gt;The Autodidacts&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://strfry.org/blog/" target="_blank" rel="noopener"
&gt;Strfry&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://sites.google.com/site/fabienlotte/research/code-and-softwares" target="_blank" rel="noopener"
&gt;Fabien Lotte&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://eeghacker.blogspot.ca/" target="_blank" rel="noopener"
&gt;Chip Audette EEG Hacker&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://alexandre.barachant.org/" target="_blank" rel="noopener"
&gt;Alexandre Barachant&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://lambdaloop.com/" target="_blank" rel="noopener"
&gt;Pierre Karashchuk&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://phd.jfrey.info/" target="_blank" rel="noopener"
&gt;Jeremy Frey&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.irenevigueguix.com" target="_blank" rel="noopener"
&gt;Irene Vigué Guix&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="competitions"&gt;Competitions
&lt;/h2&gt;&lt;h3 id="data-competitions"&gt;Data Competitions
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.kaggle.com/c/grasp-and-lift-eeg-detection" target="_blank" rel="noopener"
&gt;Kaggle Grasp and Lift&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.kaggle.com/c/inria-bci-challenge" target="_blank" rel="noopener"
&gt;Kaggle Error Detection&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.kaggle.com/c/decoding-the-human-brain" target="_blank" rel="noopener"
&gt;Kaggle Decode the Human Brain&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.kaggle.com/c/seizure-prediction" target="_blank" rel="noopener"
&gt;Kaggle Seizure Prediction&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.kaggle.com/c/seizure-detection" target="_blank" rel="noopener"
&gt;Kaggle Seizure Detection&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.bbci.de/competition/iv/" target="_blank" rel="noopener"
&gt;BCI Competition&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.br41n.io/" target="_blank" rel="noopener"
&gt;BR41N.io BCI Competition&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="brain-controlled-competitions"&gt;Brain Controlled Competitions
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://braindronerace.com/" target="_blank" rel="noopener"
&gt;Brain Drone Competition&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.cybathlon.ethz.ch/" target="_blank" rel="noopener"
&gt;Cybathlon&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="conferences-and-events"&gt;Conferences and Events
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://bcisociety.org/events/" target="_blank" rel="noopener"
&gt;&lt;strong&gt;List&lt;/strong&gt;: Curated list of events (BCI Society)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://bcisociety.org/bci-thursdays-online-events/" target="_blank" rel="noopener"
&gt;BCI Thursdays (BCI Society)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://bcisociety.org/bci-meeting/" target="_blank" rel="noopener"
&gt;BCI Meeting&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.neurogamingconf.com/" target="_blank" rel="noopener"
&gt;NeuroGaming / XTech&lt;/a&gt; &lt;a class="link" href="https://www.youtube.com/user/NeuroGamingCon/videos" target="_blank" rel="noopener"
&gt;(YouTube videos)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://chi2016.acm.org/wp/" target="_blank" rel="noopener"
&gt;CHI&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://conference.israelbrain.org/" target="_blank" rel="noopener"
&gt;BrainTech&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://brainsummit.com/" target="_blank" rel="noopener"
&gt;Brain Summit&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://nips.cc/" target="_blank" rel="noopener"
&gt;NeurIPS (formerly NIPS)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.sfn.org/" target="_blank" rel="noopener"
&gt;SfN&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.gtec.at/event/bci-neurotechnology-spring-school-2025/" target="_blank" rel="noopener"
&gt;g.tec Spring School on BCI&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="reading-material"&gt;Reading Material
&lt;/h2&gt;&lt;h3 id="papers"&gt;Papers
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.researchgate.net/publication/51727880_Multiclass_Brain-Computer_Interface_Classification_by_Riemannian_Geometry" target="_blank" rel="noopener"
&gt;Multiclass Brain-Computer Interface Classification by Riemannian Geometry&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.researchgate.net/publication/258144410_A_New_Generation_of_Brain-Computer_Interface_Based_on_Riemannian_Geometry" target="_blank" rel="noopener"
&gt;A New Generation of Brain-Computer Interface Based on Riemannian Geometry&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0130129" target="_blank" rel="noopener"
&gt;My Virtual Dream: Collective Neurofeedback in an Immersive Art Environment&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3272647/" target="_blank" rel="noopener"
&gt;BCI Competition IV – Data Set I: Learning Discriminative Patterns for Self-Paced EEG-Based Motor Imagery Detection&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://doc.ml.tu-berlin.de/bbci/publications/BlaLemTreHauMue10.pdf" target="_blank" rel="noopener"
&gt;Single-Trial Analysis and Classification of ERP Components – a Tutorial&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.researchgate.net/publication/301817936_Interpretable_Deep_Neural_Networks_for_Single-Trial_EEG_Classification" target="_blank" rel="noopener"
&gt;Interpretable Deep Neural Networks for Single-Trial EEG Classification&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0148886" target="_blank" rel="noopener"
&gt;Large-Scale Assessment of a Fully Automatic Co-Adaptive Motor Imagery-Based Brain Computer Interface&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.nature.com/articles/srep25803" target="_blank" rel="noopener"
&gt;Word pair classification during imagined speech using direct brain recording&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.ncbi.nlm.nih.gov/pubmed/28275048" target="_blank" rel="noopener"
&gt;Brain-Computer Interfaces Review (Lebedev &amp;amp; Nicolelis, 2017)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.pnas.org/content/112/44/E6058.abstract" target="_blank" rel="noopener"
&gt;High-speed spelling with a noninvasive brain–computer interface&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0172400" target="_blank" rel="noopener"
&gt;A high-speed brain-computer interface (BCI) using dry EEG electrodes&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
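The first two papers above classify EEG trials by treating each trial&amp;rsquo;s spatial covariance matrix as a point on the manifold of symmetric positive-definite matrices and assigning it to the nearest class mean. Here is a minimal, NumPy-only sketch of that idea, using the log-Euclidean metric (a common simplification of the affine-invariant metric used in the papers); the data and every parameter are illustrative, not drawn from the papers:

```python
import numpy as np

def spd_log(C):
    """Matrix logarithm of a symmetric positive-definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(C)
    return (V * np.log(w)) @ V.T

def trial_cov(rng, scale, n_ch=2, n_samp=500):
    """Toy EEG trial: class identity is encoded in the variance of channel 0 (synthetic data)."""
    x = rng.standard_normal((n_ch, n_samp))
    x[0] *= scale
    return x @ x.T / n_samp

rng = np.random.default_rng(0)
# Two classes of training trials with different channel-0 variance.
logs_a = [spd_log(trial_cov(rng, 1.0)) for _ in range(20)]
logs_b = [spd_log(trial_cov(rng, 3.0)) for _ in range(20)]
mean_a = sum(logs_a) / len(logs_a)   # class means, computed in the log domain
mean_b = sum(logs_b) / len(logs_b)

def classify(C):
    """Nearest class mean under the log-Euclidean (Frobenius-in-log-domain) distance."""
    L = spd_log(C)
    d_a = np.linalg.norm(L - mean_a, "fro")
    d_b = np.linalg.norm(L - mean_b, "fro")
    return "A" if d_a < d_b else "B"

print(classify(trial_cov(rng, 3.0)))
```

The pyRiemann Python package provides maintained implementations of the full covariance-based pipelines described in these papers.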
&lt;h3 id="introductory-books"&gt;Introductory Books
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://www.amazon.com/Beyond-Boundaries-Neuroscience-Connecting-Machines/dp/1250002613" target="_blank" rel="noopener"
&gt;Beyond Boundaries (Nicolelis)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.amazon.com/Rhythms-Brain-Gyorgy-Buzsaki/dp/0199828237" target="_blank" rel="noopener"
&gt;Rhythms of the Brain (Buzsáki)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.amazon.com/Cycles-mind-rhythms-control-perception-ebook/dp/B013ZI5AIA" target="_blank" rel="noopener"
&gt;Cycles in Mind (Cohen)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.amazon.com/Principles-Neural-Science-Eric-Kandel/dp/0838577016" target="_blank" rel="noopener"
&gt;Principles of Neural Science (Kandel et al.)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.amazon.com/The-Future-Mind-Scientific-Understand/dp/038553082X" target="_blank" rel="noopener"
&gt;The Future of the Mind (Kaku)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="technical-books"&gt;Technical Books
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://www.amazon.com/Brain-Computer-Interfacing-Introduction-Rajesh-Rao/dp/0521769418" target="_blank" rel="noopener"
&gt;Brain-Computer Interfacing: An Introduction (Rao)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.amazon.com/Brain-Computer-Interfaces-Principles-Jonathan-Wolpaw/dp/0195388852" target="_blank" rel="noopener"
&gt;Brain Computer Interfaces (Wolpaw)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://mitpress.mit.edu/books/analyzing-neural-time-series-data" target="_blank" rel="noopener"
&gt;Analyzing Neural Time Series Data (Cohen)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.springer.com/us/book/9781461449836" target="_blank" rel="noopener"
&gt;Imaging Brain Function with EEG (Freeman &amp;amp; Quiroga)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.amazon.com/MATLAB-Neuroscientists-Introduction-Scientific-Computing/dp/0123745519" target="_blank" rel="noopener"
&gt;MATLAB for Neuroscientists&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://books.google.ca/books?id=EJeQ0hAB76gC&amp;amp;pg=PR3&amp;amp;redir_esc=y#v=onepage&amp;amp;q&amp;amp;f=false" target="_blank" rel="noopener"
&gt;Biomedical Optics&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://imotions.com/blog/eeg-books/" target="_blank" rel="noopener"
&gt;iMotions Top 10 EEG Books&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="signal-processing"&gt;Signal Processing
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://ocw.mit.edu/resources/res-6-007-signals-and-systems-spring-2011/" target="_blank" rel="noopener"
&gt;Signals &amp;amp; Systems MIT Class&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Berkeley DSP class: &lt;a class="link" href="https://www.youtube.com/watch?v=6_-ljdxjwac&amp;amp;list=PL-XXv-cvA_iCUQkarn2fxB3NggnPF_dob" target="_blank" rel="noopener"
&gt;lectures&lt;/a&gt;, &lt;a class="link" href="https://inst.eecs.berkeley.edu/~ee123/sp15/" target="_blank" rel="noopener"
&gt;course page&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.amazon.com/Signals-Systems-Edition-Alan-Oppenheim/dp/0138147574" target="_blank" rel="noopener"
&gt;Signals &amp;amp; Systems (Oppenheim, Willsky &amp;amp; Nawab)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.amazon.com/Discrete-Time-Signal-Processing-Edition-Prentice-Hall/dp/0137549202" target="_blank" rel="noopener"
&gt;Discrete-Time Signal Processing (2nd Edition) (Oppenheim, Schafer, Buck)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.youtube.com/@mikexcohen1" target="_blank" rel="noopener"
&gt;Data analysis lecturelets (Mike X Cohen)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
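To give a concrete flavour of what these courses and books cover: EEG analysis routinely begins by band-pass filtering the raw signal into a band of interest. A minimal sketch with SciPy, where the synthetic signal, frequencies, and filter parameters are all illustrative:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(x, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter (filtfilt avoids phase distortion)."""
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, x)

fs = 256.0                              # typical consumer-EEG sampling rate
t = np.arange(0, 4, 1 / fs)
# Synthetic signal: a 10 Hz "alpha" rhythm plus 50 Hz line noise.
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
y = bandpass(x, 8.0, 12.0, fs)          # keep the alpha band, reject line noise
```

After filtering, the 50 Hz component is strongly attenuated while the 10 Hz rhythm passes nearly unchanged, which can be verified by comparing the spectrum of `y` at those two frequencies.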
&lt;h2 id="schools--summer-courses"&gt;Schools &amp;amp; Summer Courses
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://neurotechmicrocreds.com/" target="_blank" rel="noopener"
&gt;NeuroTech MicroCredentials Course&lt;/a&gt;: An accredited series of theoretical and hands-on courses on neurotechnology, offered by NeuroTechX and Queen&amp;rsquo;s University.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://neuromatch.io/courses/" target="_blank" rel="noopener"
&gt;Neuromatch Academy (NMA) Summer Schools&lt;/a&gt;: An online, community-driven set of summer schools in computational sciences&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://sincxpress.com/summerschool.html" target="_blank" rel="noopener"
&gt;Sincxpress summer schools&lt;/a&gt; by Mike X Cohen&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://brainhack.org/" target="_blank" rel="noopener"
&gt;Brainhack&lt;/a&gt;: A community-driven school for neurotech enthusiasts, held online and in person in many cities worldwide.&lt;/li&gt;
&lt;li&gt;Recurring summer schools or community-maintained lists of Neurotech-related summer schools
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://nschawor.github.io/posts/2024/neuro-summer-schools/" target="_blank" rel="noopener"
&gt;List maintained by N. Schaworonkow&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://nayanika-biswas.notion.site/58f1530bd891475eb92f1e2e4984022f?v=83fc50c53b3a4191aa6f7cdf8d9b4e40" target="_blank" rel="noopener"
&gt;List maintained by N. Biswas&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="other-resources"&gt;Other Resources
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.coursera.org/learn/medical-neuroscience" target="_blank" rel="noopener"
&gt;Neuroscience Duke Course (Coursera)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://web.archive.org/web/20230610175248/https://www.nuffieldbioethics.org/wp-content/uploads/2013/06/Novel_neurotechnologies_report_PDF_web_0.pdf" target="_blank" rel="noopener"
&gt;Novel Neurotechnologies Intervening in the Brain&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4072086/" target="_blank" rel="noopener"
&gt;Augment Human Cognition by optimizing cortical oscillations&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://open-neuroscience.com/" target="_blank" rel="noopener"
&gt;Open Neuroscience&lt;/a&gt; - A user-driven database of open-source and open-science projects related to neuroscience.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/okbalefthanded/awesome-bci-reviews" target="_blank" rel="noopener"
&gt;Awesome-BCI-Reviews&lt;/a&gt; - Curated list of peer-reviewed Brain-Computer Interface reviews and surveys, ordered by year of publication.&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>GitHub Awesome NeuroScience</title><link>https://hanguangwu.github.io/blog/en/p/github-awesome-neuroscience/</link><pubDate>Thu, 29 Jan 2026 15:35:25 -0800</pubDate><guid>https://hanguangwu.github.io/blog/en/p/github-awesome-neuroscience/</guid><description>&lt;h1 id="awesome-neuroscience"&gt;Awesome Neuroscience
&lt;/h1&gt;&lt;h2 id="introduction"&gt;Introduction
&lt;/h2&gt;&lt;p&gt;Curated list of awesome neuroscience libraries, software and any content related to the domain.&lt;/p&gt;
&lt;p&gt;&lt;a class="link" href="https://en.wikipedia.org/wiki/Neuroscience" target="_blank" rel="noopener"
&gt;Neuroscience&lt;/a&gt; is the study of how the nervous system develops, its structure, and what it does. Neuroscientists focus on the brain and its impact on behavior and cognitive functions. Traditionally, neuroscience has been seen as a branch of biology, but it has grown to encompass a wide range of interdisciplinary fields that work together toward elucidating brain function at multiple levels of investigation.&lt;/p&gt;
&lt;h2 id="programming"&gt;Programming
&lt;/h2&gt;&lt;p&gt;Software, libraries and frameworks for development purposes.&lt;/p&gt;
&lt;h3 id="python"&gt;Python
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/nengo/nengo" target="_blank" rel="noopener"
&gt;Nengo&lt;/a&gt; - Library for creating and simulating large-scale brain models.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/nipy/nitime" target="_blank" rel="noopener"
&gt;Nitime&lt;/a&gt; - Timeseries analysis for neuroscience data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/nilearn/nilearn" target="_blank" rel="noopener"
&gt;Nilearn&lt;/a&gt; - Module for performing statistical learning/machine learning on NeuroImaging data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/nipy/dipy" target="_blank" rel="noopener"
&gt;DIPY&lt;/a&gt; - Toolbox for analysis of MR diffusion imaging.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/mne-tools/mne-python" target="_blank" rel="noopener"
&gt;MNE-Python&lt;/a&gt; - Community-driven software for processing time-resolved neural signals including electroencephalography (EEG) and magnetoencephalography (MEG).&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/nipy/nibabel" target="_blank" rel="noopener"
&gt;NiBabel&lt;/a&gt; - Provides read and write access to some common medical and neuroimaging file formats.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/psychopy/psychopy" target="_blank" rel="noopener"
&gt;PsychoPy&lt;/a&gt; - Package for running psychology and neuroscience experiments. It allows for creating psychology stimuli in Python.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/numenta/nupic" target="_blank" rel="noopener"
&gt;NuPIC&lt;/a&gt; - Numenta Platform for Intelligent Computing is an implementation of Hierarchical Temporal Memory (HTM), a theory of intelligence based strictly on the neuroscience of the neocortex.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/brian-team/brian2" target="_blank" rel="noopener"
&gt;Brian2&lt;/a&gt; - Free, open source simulator for spiking neural networks.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/expyriment/expyriment" target="_blank" rel="noopener"
&gt;expyriment&lt;/a&gt; - Platform-independent lightweight Python library for designing and conducting timing-critical behavioural and neuroimaging experiments.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/Hananel-Hazan/bindsnet" target="_blank" rel="noopener"
&gt;BindsNET&lt;/a&gt; - Package for simulating spiking neural networks for reinforcement &amp;amp; machine learning.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/SpikeInterface/spikeinterface" target="_blank" rel="noopener"
&gt;SpikeInterface&lt;/a&gt; - Framework designed to unify spike-sorting technologies.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://nimare.readthedocs.io/en/latest/" target="_blank" rel="noopener"
&gt;NiMARE&lt;/a&gt; - Python package for neuroimaging meta-analyses.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="matlab"&gt;Matlab
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://bdtoolbox.org/" target="_blank" rel="noopener"
&gt;Brain Dynamics Toolbox&lt;/a&gt; - Open software for simulating dynamical systems in neuroscience.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://neuroimage.usc.edu/brainstorm/" target="_blank" rel="noopener"
&gt;BrainStorm&lt;/a&gt; - Open-source application dedicated to the analysis of brain recordings (MEG, EEG, fNIRS, ECoG, depth electrodes and multiunit electrophysiology).&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://sccn.ucsd.edu/eeglab/" target="_blank" rel="noopener"
&gt;EEGLAB&lt;/a&gt; - Interactive Matlab toolbox for processing continuous and event-related EEG, MEG and other electrophysiological data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/fieldtrip/fieldtrip" target="_blank" rel="noopener"
&gt;FieldTrip&lt;/a&gt; - Toolbox for MEG and EEG analysis.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://psychtoolbox.org/" target="_blank" rel="noopener"
&gt;Psychtoolbox-3&lt;/a&gt; - Free set of Matlab and GNU Octave functions for vision and neuroscience research.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.fil.ion.ucl.ac.uk/spm/" target="_blank" rel="noopener"
&gt;SPM&lt;/a&gt; - Free and open source software for the analysis of brain imaging data sequences (fMRI, PET, SPECT, EEG, MEG).&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="c"&gt;C++
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/BlueBrain/Brayns" target="_blank" rel="noopener"
&gt;Brayns&lt;/a&gt; - Minimalistic visualiser that can perform ray-traced rendering of neurons. Ray-tracing can help to highlight areas of neural circuits where cells touch each other and where synapses are being created leading to a better understanding of how individual cells and subsequently the brain functions.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="javascript"&gt;JavaScript
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/aces/brainbrowser" target="_blank" rel="noopener"
&gt;Brainbrowser&lt;/a&gt; - Library exposing a set of web-based 3D visualization tools primarily targeting neuroimaging.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.jspsych.org/" target="_blank" rel="noopener"
&gt;jsPsych&lt;/a&gt; - Library for creating and running behavioural experiments in a web browser.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="r"&gt;R
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/jefferis/nat" target="_blank" rel="noopener"
&gt;nat: NeuroAnatomy Toolbox&lt;/a&gt; - Package for the (3D) visualisation and analysis of biological image data, especially tracings of single neurons.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/cwatson/brainGraph" target="_blank" rel="noopener"
&gt;brainGraph&lt;/a&gt; - Package for performing graph theory analyses of brain MRI data.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="resources"&gt;Resources
&lt;/h2&gt;&lt;p&gt;Interesting resources related to neuroscience.&lt;/p&gt;
&lt;h3 id="ebooks"&gt;Ebooks
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://nba.uth.tmc.edu/neuroscience/m/index.htm" target="_blank" rel="noopener"
&gt;Neuroscience Online&lt;/a&gt; - Open-access electronic textbook and interactive courseware covering neuroscience in depth. Provided by the Department of Neurobiology and Anatomy at the University of Texas Medical School at Houston.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://grey.colorado.edu/CompCogNeuro/index.php/CCNBook/Main" target="_blank" rel="noopener"
&gt;Computational Cognitive Neuroscience&lt;/a&gt; - Text which provides an in-depth introduction to the main ideas in computational cognitive neuroscience, a field which aims to understand the brain by using biologically based computational models.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://neuronaldynamics.epfl.ch" target="_blank" rel="noopener"
&gt;Neuronal Dynamics&lt;/a&gt; - Open-access electronic textbook that covers computational and theoretical neuroscience. Provided by École Polytechnique Fédérale de Lausanne (EPFL).&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://andysbrainbook.readthedocs.io/en/latest/" target="_blank" rel="noopener"
&gt;Andy&amp;rsquo;s Brain Book&lt;/a&gt; - Book companion to &lt;a class="link" href="https://www.andysbrainblog.com/" target="_blank" rel="noopener"
&gt;Andy&amp;rsquo;s Brain Blog&lt;/a&gt;. Provides an introduction to working in a Unix environment, fMRI analysis, and commonplace neuroimaging tools and topics.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://textbook.nipraxis.org/intro.html" target="_blank" rel="noopener"
&gt;NiPraxis&lt;/a&gt; - Textbook for the &lt;a class="link" href="https://nipraxis.org/" target="_blank" rel="noopener"
&gt;NiPraxis course&lt;/a&gt;, covers fundamental concepts in neuroimaging analysis and how they relate to the wider world of statistics, engineering and computer science. Learn how to work with data and code to get a deeper understanding of how fMRI methods work, how they can fail, how to fix them, and how to develop new methods.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="blogs"&gt;Blogs
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.discovermagazine.com/author/neuroskeptic" target="_blank" rel="noopener"
&gt;Neuroskeptic&lt;/a&gt; - &lt;a class="link" href="http://discovermagazine.com/" target="_blank" rel="noopener"
&gt;Discover magazine&lt;/a&gt;&amp;rsquo;s neuroscience blog which offers a look at the latest developments in neuroscience, psychiatry and psychology through a critical lens.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://neurocritic.blogspot.in/" target="_blank" rel="noopener"
&gt;The Neurocritic&lt;/a&gt; - Often critical takes on the most sensationalistic recent findings in Human Brain Imaging, Cognitive Neuroscience, and Psychopharmacology.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://blogs.scientificamerican.com/scicurious-brain/" target="_blank" rel="noopener"
&gt;The scicurious brain&lt;/a&gt; - Maintained by &lt;a class="link" href="https://blogs.scientificamerican.com/" target="_blank" rel="noopener"
&gt;Scientific American&lt;/a&gt;, this blog typically covers one research paper in a single entry.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://blogs.nature.com/actionpotential" target="_blank" rel="noopener"
&gt;Action Potential&lt;/a&gt; - Forum operated by neuroscience editors at the journal Nature.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.andysbrainblog.com/" target="_blank" rel="noopener"
&gt;Andy&amp;rsquo;s Brain Blog&lt;/a&gt; - A large collection of articles, tutorials, and videos, covering many of the popular neuroimaging tools and methods.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="moocs"&gt;MOOCs
&lt;/h3&gt;&lt;p&gt;&lt;a class="link" href="https://en.wikipedia.org/wiki/Massive_open_online_course" target="_blank" rel="noopener"
&gt;Massive Open Online Courses (MOOCs)&lt;/a&gt; are free Web-based distance learning programs that are designed for the participation of large numbers of geographically dispersed students.
MOOCs may be patterned on a college or university course or may be less structured.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.mcb80x.org/" target="_blank" rel="noopener"
&gt;The Fundamentals of Neuroscience | Harvard &amp;amp; edX&lt;/a&gt; - Serves as an introductory survey of topics in neuroscience and has no specific prerequisites, though some prior exposure to biology and/or chemistry can be helpful.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://ocw.mit.edu/courses/brain-and-cognitive-sciences/9-01-introduction-to-neuroscience-fall-2007/" target="_blank" rel="noopener"
&gt;Introduction to Neuroscience | MIT OCW&lt;/a&gt; - Introduction to the mammalian nervous system, with emphasis on the structure and function of the human brain.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.coursera.org/learn/computational-neuroscience" target="_blank" rel="noopener"
&gt;Computational Neuroscience | Coursera&lt;/a&gt; - Provides an introduction to basic computational methods for understanding what nervous systems do and for determining how they function.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.coursera.org/learn/medical-neuroscience" target="_blank" rel="noopener"
&gt;Medical Neuroscience&lt;/a&gt; - Explores the functional organization and neurophysiology of the human central nervous system, while providing a neurobiological framework for understanding human behavior.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/NeuromatchAcademy/course-content" target="_blank" rel="noopener"
&gt;Neuromatch Academy&lt;/a&gt; - Jupyter notebooks for the three-week intensive summer school in computational neuroscience.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="communities"&gt;Communities
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.quora.com/topic/Neuroscience-1" target="_blank" rel="noopener"
&gt;Quora&lt;/a&gt; - Neuroscience topic on Quora contains answers, often by experts, to questions ranging from basic to advanced.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.reddit.com/r/ScienceNetwork/comments/ptye0/link_tables/" target="_blank" rel="noopener"
&gt;Reddit&lt;/a&gt; - List of neuroscience, psychology and cognitive science subreddits.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://psychology.stackexchange.com" target="_blank" rel="noopener"
&gt;StackExchange&lt;/a&gt; - Psychology and neuroscience StackExchange site.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://mail.python.org/mailman/listinfo/neuroimaging" target="_blank" rel="noopener"
&gt;neuroimaging@python.org&lt;/a&gt; - A list for discussion of neuroimaging analysis in Python. Among other things, this list is home to discussions concerning &lt;a class="link" href="https://nipy.org/" target="_blank" rel="noopener"
&gt;NiPy&lt;/a&gt; projects (including NiBabel, Nilearn, dipy, MNE-Python, and more).&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="newsletters"&gt;Newsletters
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://neuro.hms.harvard.edu/harvard-mahoney-neuroscience-institute/hmni-newsletter" target="_blank" rel="noopener"
&gt;On The Brain&lt;/a&gt; - Harvard Mahoney Neuroscience Institute&amp;rsquo;s quarterly e-newsletter.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.tnb.ua.ac.be/mailman/listinfo/comp-neuro" target="_blank" rel="noopener"
&gt;Comp-neuro&lt;/a&gt; - A mailing list intended to address the broad range of research approaches and issues involved in the general field of computational neuroscience.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.brainpost.co/" target="_blank" rel="noopener"
&gt;BrainPost&lt;/a&gt; - A mailing list that delivers weekly easy-to-read summaries of the latest neuroscience publications.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="miscellaneous"&gt;Miscellaneous
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/awesomedata/awesome-public-datasets#neuroscience" target="_blank" rel="noopener"
&gt;Awesome Public Datasets - Neuroscience&lt;/a&gt; - High-quality open neuroscience datasets.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://justinmeiners.github.io/neural-nets-sim/" target="_blank" rel="noopener"
&gt;McCulloch &amp;amp; Pitts Neural Net Simulator&lt;/a&gt; - Simulator for a historical computational model based on neurons.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://senselab.med.yale.edu/ModelDB/default.cshtml" target="_blank" rel="noopener"
&gt;ModelDB&lt;/a&gt; - Searchable database for computational neuroscience models.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://senselab.med.yale.edu/NeuronDB" target="_blank" rel="noopener"
&gt;NeuronDB&lt;/a&gt; - Searchable database of three types of neuronal properties: voltage-gated conductances, neurotransmitter receptors, and neurotransmitter substances.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://neuroelectro.org/" target="_blank" rel="noopener"
&gt;NeuroElectro&lt;/a&gt; - Searchable database of neurons and their electrophysiological properties (extracted from the literature).&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://learn-anything.xyz/neuroscience" target="_blank" rel="noopener"
&gt;Neuroscience Mindmap&lt;/a&gt; - Interactive mindmap containing curated resources for anyone interested in learning neuroscience.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/PhABC/neuroSummerSchools" target="_blank" rel="noopener"
&gt;neuroSummerSchools&lt;/a&gt; - List of summer (and other seasonal) schools in neuroscience and related fields.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://brainpodcast.com/" target="_blank" rel="noopener"
&gt;Brain Matters&lt;/a&gt; - Neuroscience podcast where real neuroscientists sit down and talk about the brain.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://neurohackademy.org/course_type/lectures/" target="_blank" rel="noopener"
&gt;NeuroHackademy&lt;/a&gt; - Summer school in neuroimaging and data science, held at the University of Washington eScience Institute. Lectures are available through the institute&amp;rsquo;s &lt;a class="link" href="https://www.youtube.com/@UWeScienceInstitute" target="_blank" rel="noopener"
&gt;YouTube channel&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/PTDZ/SORTED" target="_blank" rel="noopener"
&gt;SORTED&lt;/a&gt; - A list of interesting science ideas and links (cognitive/neuro &amp;amp; data science).&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>GitHub Awesome Neuroimaging</title><link>https://hanguangwu.github.io/blog/en/p/github-awesome-neuroimaging/</link><pubDate>Thu, 29 Jan 2026 14:34:25 -0800</pubDate><guid>https://hanguangwu.github.io/blog/en/p/github-awesome-neuroimaging/</guid><description>&lt;h1 id="awesome-neuroimaging"&gt;Awesome Neuroimaging
&lt;/h1&gt;&lt;h2 id="introduction"&gt;Introduction
&lt;/h2&gt;&lt;blockquote&gt;
&lt;p&gt;Exploring, organizing, and analysing brain images and recordings. MR focused.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h2 id="viewers"&gt;Viewers
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://afni.nimh.nih.gov/pub/dist/doc/htmldoc/afniandafni/gui_guide/main_toc.html" target="_blank" rel="noopener"
&gt;AFNI&lt;/a&gt; - Volumetric viewer from the AFNI suite. GUI aesthetic defined by the &amp;rsquo;90s-era Motif toolkit.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://surfer.nmr.mgh.harvard.edu/fswiki/FreeviewGuide/FreeviewIntroduction" target="_blank" rel="noopener"
&gt;freeview&lt;/a&gt; - Surface and volumetric image viewer from the Freesurfer suite. Uses the Qt toolkit.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FSLeyes" target="_blank" rel="noopener"
&gt;fsleyes&lt;/a&gt; - Volumetric viewer from FSL.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.nitrc.org/projects/mricron" target="_blank" rel="noopener"
&gt;mricron&lt;/a&gt; - Volumetric viewer that works on Windows.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://dsi-studio.labsolver.org/doc/gui_t1.html" target="_blank" rel="noopener"
&gt;dsistudio&lt;/a&gt; - DSI viewer from the dsi-studio suite of tools.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.osirix-viewer.com/" target="_blank" rel="noopener"
&gt;osirix&lt;/a&gt; - DICOM database organizer and viewer.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://mangoviewer.com/" target="_blank" rel="noopener"
&gt;Mango&lt;/a&gt; - Multi-image Analysis GUI: a viewer for medical research images (DICOM, NIfTI, surface, etc.); version 4.1 was released in 2019.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.humanconnectome.org/software/connectome-workbench" target="_blank" rel="noopener"
&gt;&lt;code&gt;wb_view&lt;/code&gt;&lt;/a&gt; - Connectome workbench surface file viewer.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="acquisition"&gt;Acquisition
&lt;/h2&gt;&lt;h3 id="mr"&gt;MR
&lt;/h3&gt;&lt;h4 id="organization"&gt;Organization
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://dbic-handbook.readthedocs.io/en/latest/mri/reproin.html" target="_blank" rel="noopener"
&gt;reproIn&lt;/a&gt; - Convention for naming exam card sequences so that acquisitions can be converted to BIDS automatically.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://bids-specification.readthedocs.io/en/stable/" target="_blank" rel="noopener"
&gt;BIDS&lt;/a&gt; - Brain Imaging Data Structure - directory hierarchy and file naming specification.&lt;/li&gt;
&lt;/ul&gt;
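As a concrete example of the layout BIDS prescribes, a minimal single-subject dataset might look like the tree below (the subject and task names are hypothetical; file naming follows the BIDS specification):

```
dataset_description.json
participants.tsv
sub-01/
  anat/
    sub-01_T1w.nii.gz
  func/
    sub-01_task-rest_bold.nii.gz
    sub-01_task-rest_bold.json
```

Every filename encodes its entities (subject, task, modality) so that tools like the bids-validator and fmriprep can discover data without extra configuration.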
&lt;h4 id="management"&gt;Management
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://en.wikipedia.org/wiki/Picture_archiving_and_communication_system" target="_blank" rel="noopener"
&gt;PACS&lt;/a&gt; - Picture Archiving and Communication System standard used to store and transfer DICOM images from medical equipment, typically implemented by the scanner manufacturer. See &lt;a class="link" href="https://www.siemens-healthineers.com/en-us/digital-health-solutions/syngo-carbon" target="_blank" rel="noopener"
&gt;Siemens Healthineers Syngo Carbon&lt;/a&gt;, &lt;a class="link" href="https://www.documents.philips.com/assets/20240227/5a788a79bbdd4e1986f1b12300b0e534.pdf" target="_blank" rel="noopener"
&gt;Philips Vue PACS&lt;/a&gt;, &lt;a class="link" href="https://www.gehealthcare.com/products/healthcare-it/true-pacs" target="_blank" rel="noopener"
&gt;GE HealthCare True PACS&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.xnat.org/" target="_blank" rel="noopener"
&gt;XNAT&lt;/a&gt; - An extensible open-source imaging informatics software platform dedicated to imaging-based research.
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/VUIIS/dax" target="_blank" rel="noopener"
&gt;DAX&lt;/a&gt; - Distributed Automation for XNAT: uses containerization with YAML-defined inputs and outputs.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://mcin.ca/technology/loris/" target="_blank" rel="noopener"
&gt;LORIS&lt;/a&gt; - LORIS (Longitudinal Online Research and Imaging System) is web-based data and project management software for neuroimaging research studies.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://brainlife.io" target="_blank" rel="noopener"
&gt;brainlife.io&lt;/a&gt; - Open-source, free and secure reproducible neuroscience analysis platform.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://mcin.ca/technology/cbrain/" target="_blank" rel="noopener"
&gt;cbrain&lt;/a&gt; - CBRAIN is web-based software that allows neuroimaging researchers to perform computationally intensive analyses on data by connecting them to High-Performance Computing (HPC).&lt;/li&gt;
&lt;li&gt;💲&lt;a class="link" href="https://flywheel.io" target="_blank" rel="noopener"
&gt;Flywheel&lt;/a&gt; - A cloud-based imaging research data platform for data capture, curation, automation, and machine learning.&lt;/li&gt;
&lt;li&gt;💲&lt;a class="link" href="https://qmenta.com" target="_blank" rel="noopener"
&gt;QMENTA&lt;/a&gt; - All-in-one imaging platform for clinical trials.&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="motion"&gt;Motion
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://firmm.readthedocs.io" target="_blank" rel="noopener"
&gt;FIRMM&lt;/a&gt; - Real-time motion monitoring for fMRI, diffusion, and navigated T1/T2 image acquisition.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://afni.nimh.nih.gov/pub/dist/doc/program_help/Dimon.html" target="_blank" rel="noopener"
&gt;&lt;code&gt;Dimon&lt;/code&gt;&lt;/a&gt; - Monitor real-time acquisition of DICOM image files with AFNI.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="quality-assurance-and-checking"&gt;Quality Assurance and Checking
&lt;/h2&gt;&lt;p&gt;QA and QC of scanner images.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://mriqc.readthedocs.io/" target="_blank" rel="noopener"
&gt;MRIQC&lt;/a&gt; - Extracts no-reference IQMs (image quality metrics) from structural (T1w and T2w) and functional MRI (magnetic resonance imaging) data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/Open-Minds-Lab/mrQA" target="_blank" rel="noopener"
&gt;mrQA&lt;/a&gt; - Tools for quality assurance in medical imaging datasets, including protocol compliance.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/bids-standard/bids-validator/" target="_blank" rel="noopener"
&gt;bids-validator&lt;/a&gt; - Validator for the Brain Imaging Data Structure.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="pipelines"&gt;Pipelines
&lt;/h2&gt;&lt;p&gt;Preprocessing workflows.&lt;/p&gt;
&lt;h3 id="suites"&gt;Suites
&lt;/h3&gt;&lt;p&gt;Software packages for multiple modalities, often offering a graphical user interface.&lt;/p&gt;
&lt;!--lint ignore double-link--&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://afni.nimh.nih.gov/" target="_blank" rel="noopener"
&gt;AFNI&lt;/a&gt; - Analysis of Functional NeuroImages is a leading software suite of C, Python, R programs and shell scripts primarily developed for the analysis and display of multiple MRI modalities.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://fsl.fmrib.ox.ac.uk/fsl/fslwiki" target="_blank" rel="noopener"
&gt;FSL&lt;/a&gt; - A comprehensive library of analysis tools for FMRI, MRI and diffusion brain imaging data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.fil.ion.ucl.ac.uk/spm/" target="_blank" rel="noopener"
&gt;SPM&lt;/a&gt; - Statistical Parametric Mapping refers to the construction and assessment of spatially extended statistical processes used to test hypotheses about functional imaging data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://qunex.yale.edu/" target="_blank" rel="noopener"
&gt;Qu|Nex&lt;/a&gt; - The Quantitative Neuroimaging Environment &amp;amp; Toolbox (QuNex) is an open-source software suite that collectively supports an extensible framework for data organization, preprocessing, quality assurance, and analyses across neuroimaging modalities.
&lt;!-- - [MINC](https://mcin.ca/technology/minc/) -- more of a format than suit? --&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="bold"&gt;BOLD
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://fmriprep.org/" target="_blank" rel="noopener"
&gt;fmriprep&lt;/a&gt; - Accessible preprocessing pipeline robust to variations in scan acquisition protocols, with comprehensive error and output reporting. Input is a BIDS dataset.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://afni.nimh.nih.gov/pub/dist/doc/program_help/afni_proc.py.html" target="_blank" rel="noopener"
&gt;&lt;code&gt;afni_proc.py&lt;/code&gt;&lt;/a&gt; - Best practice pipelines with pre-configured blocks using AFNI.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/HALFpipe/HALFpipe" target="_blank" rel="noopener"
&gt;HALFpipe&lt;/a&gt; - User-friendly software that facilitates reproducible analysis of fMRI data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://xcp-d.readthedocs.io/en/latest/" target="_blank" rel="noopener"
&gt;XCP-D&lt;/a&gt; - Post-processing and noise-regression pipeline that picks up right where fMRIPrep ends.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://clpipe.readthedocs.io/en/latest/" target="_blank" rel="noopener"
&gt;clpipe&lt;/a&gt; - Uses fmriprep for preprocessing fMRI data and implements a variety of additional processing steps important for functional connectivity analyses such as nuisance regression and filtering.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/LabNeuroCogDevel/fmri_processing_scripts" target="_blank" rel="noopener"
&gt;fmri_processing_scripts&lt;/a&gt; - Legacy pipeline for maximal preprocessing.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.humanconnectome.org/software/hcp-mr-pipelines" target="_blank" rel="noopener"
&gt;HCP Pipeline&lt;/a&gt; - Pipeline scripts implement the Minimal Preprocessing Pipeline (MPP) described in &lt;a class="link" href="http://www.ncbi.nlm.nih.gov/pubmed/23668970" target="_blank" rel="noopener"
&gt;Glasser et al. 2013&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="dsi"&gt;DSI
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://dsi-studio.labsolver.org/" target="_blank" rel="noopener"
&gt;dsi-studio&lt;/a&gt; - A tractography software tool that maps brain connections and correlates findings with neuropsychological disorders.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://qsiprep.readthedocs.io/" target="_blank" rel="noopener"
&gt;qsiprep&lt;/a&gt; - Configures pipelines for processing diffusion-weighted MRI (dMRI) data.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="structural"&gt;Structural
&lt;/h3&gt;&lt;!--lint ignore double-link--&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://freesurfer.net/" target="_blank" rel="noopener"
&gt;Freesurfer&lt;/a&gt; - An open source neuroimaging toolkit for processing, analyzing, and visualizing human brain MR images.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://mcin.ca/technology/civet/" target="_blank" rel="noopener"
&gt;CIVET&lt;/a&gt; - An image processing pipeline for fully automated volumetric, corticometric, and morphometric analysis of human brain imaging data.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="raw-data"&gt;Raw Data
&lt;/h2&gt;&lt;p&gt;Dealing with DICOM and k-space images.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/rordenlab/dcm2niix" target="_blank" rel="noopener"
&gt;dcm2niix&lt;/a&gt; - DICOM to NIfTI converter.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/nipy/heudiconv/" target="_blank" rel="noopener"
&gt;heudiconv&lt;/a&gt; - A flexible DICOM converter for organizing brain imaging data into structured directory layouts.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://sourceforge.net/projects/gdcm/" target="_blank" rel="noopener"
&gt;gdcm&lt;/a&gt; - Grassroots DICOM is a C++ library and CLI tool for DICOM medical files.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://pydicom.github.io/" target="_blank" rel="noopener"
&gt;&lt;code&gt;pydicom&lt;/code&gt;&lt;/a&gt; - Python package and CLI tool for inspecting, modifying, and creating DICOM files.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://afni.nimh.nih.gov/pub/dist/doc/program_help/dicom_hinfo.html" target="_blank" rel="noopener"
&gt;&lt;code&gt;dicom_hinfo&lt;/code&gt;&lt;/a&gt;, &lt;code&gt;dicom_hdr&lt;/code&gt; - Print selected header information from DICOM files.
&lt;!--lint ignore double-link--&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://lncd.github.io/lncdtools/BIDS/" target="_blank" rel="noopener"
&gt;&lt;code&gt;dcmdirtab&lt;/code&gt;, &lt;code&gt;dcmtab_bids&lt;/code&gt;&lt;/a&gt; - CLI focused, regular expression based, and iteration friendly BIDS conversion pipeline from &lt;a class="link" href="https://github.com/lncd/lncdtools/" target="_blank" rel="noopener"
&gt;lncdtools&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://git.fmrib.ox.ac.uk/wclarke/pymapvbvd" target="_blank" rel="noopener"
&gt;pymapVBVD&lt;/a&gt; - Reads Siemens .dat &amp;rsquo;twix&amp;rsquo; raw data files. Python port of Philipp Ehses&amp;rsquo; Matlab tool mapVBVD.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/FNNDSC/med2image/pulls" target="_blank" rel="noopener"
&gt;med2image&lt;/a&gt; - Python CLI tool for generating jpg or png images from DICOM or nifti files.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="provenance-and-automation"&gt;Provenance and Automation
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.frontiersin.org/articles/10.3389/fninf.2016.00002/full" target="_blank" rel="noopener"
&gt;make&lt;/a&gt; - Follow script recipes defined in a &lt;code&gt;Makefile&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/datalad/datalad" target="_blank" rel="noopener"
&gt;datalad&lt;/a&gt; - Keep code, data, and containers under control with git and git-annex. Especially &lt;code&gt;datalad run --input=... --output=...&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://afni.nimh.nih.gov/pub/dist/doc/program_help/3dNotes.html" target="_blank" rel="noopener"
&gt;&lt;code&gt;3dNotes&lt;/code&gt;&lt;/a&gt; - A program to add, delete and show notes for AFNI datasets.
&lt;!--lint ignore double-link--&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/lncd/lncdtools/blob/master/niinote" target="_blank" rel="noopener"
&gt;&lt;code&gt;niinote&lt;/code&gt;&lt;/a&gt; - Run and record any command, adding AFNI NIfTI XML history to the image header. Part of &lt;a class="link" href="https://github.com/lncd/lncdtools/" target="_blank" rel="noopener"
&gt;lncdtools&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="imaging-tools"&gt;Imaging Tools
&lt;/h2&gt;&lt;p&gt;Software to read, write, and manipulate volumetric and/or surface data.&lt;/p&gt;
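To make concrete what "reading volumetric data" involves at the lowest level, here is a minimal sketch that parses the fixed-size NIfTI-1 header using only the Python standard library. The field offsets come from the NIfTI-1 specification; real analyses should use a library such as nibabel rather than this hand-rolled parser.

```python
# Sketch: parse a NIfTI-1 header with the standard library only.
# Field offsets follow the NIfTI-1 specification; native byte order
# (assumed little-endian here) is selected via struct's "=" prefix.
import struct

def parse_nifti1_header(hdr: bytes):
    """Return (volume dimensions, datatype code) from a 348-byte header."""
    assert len(hdr) == 348, "NIfTI-1 header is exactly 348 bytes"
    sizeof_hdr = struct.unpack_from("=i", hdr, 0)[0]   # must equal 348
    assert sizeof_hdr == 348, "unexpected byte order or not NIfTI-1"
    dim = struct.unpack_from("=8h", hdr, 40)           # dim[0] is the rank
    datatype = struct.unpack_from("=h", hdr, 70)[0]    # e.g. 4 = int16
    return dim[1:1 + dim[0]], datatype

# Build a fake header describing a 64x64x32 int16 volume to exercise it.
hdr = bytearray(348)
struct.pack_into("=i", hdr, 0, 348)
struct.pack_into("=8h", hdr, 40, 3, 64, 64, 32, 1, 1, 1, 1)
struct.pack_into("=h", hdr, 70, 4)
struct.pack_into("=4s", hdr, 344, b"n+1\x00")          # single-file magic
dims, dtype_code = parse_nifti1_header(bytes(hdr))
```

The same offsets explain why tools such as fsleyes and freeview can open each other's files: the on-disk format, not the suite, defines the data.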
&lt;h3 id="skullstripping"&gt;Skullstripping
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://montilab.psych.ucla.edu/fmri-wiki/optibet/" target="_blank" rel="noopener"
&gt;optibet&lt;/a&gt; - Shell script that combines AFNI and FSL tools for more robust skull stripping in patient populations.
&lt;!--lint ignore double-link--&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://afni.nimh.nih.gov/pub/dist/doc/program_help/3dSkullStrip.html" target="_blank" rel="noopener"
&gt;&lt;code&gt;3dSkullStrip&lt;/code&gt;&lt;/a&gt; - &lt;a class="link" href="https://afni.nimh.nih.gov/" target="_blank" rel="noopener"
&gt;AFNI&lt;/a&gt;&amp;rsquo;s skull stripping utility with many parameters.
&lt;!--lint ignore double-link--&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/BET/UserGuide" target="_blank" rel="noopener"
&gt;&lt;code&gt;bet&lt;/code&gt;&lt;/a&gt; - &lt;a class="link" href="https://fsl.fmrib.ox.ac.uk/fsl/fslwiki" target="_blank" rel="noopener"
&gt;FSL&lt;/a&gt;&amp;rsquo;s brain extraction tool.
&lt;!--lint ignore double-link--&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://dpaniukov.github.io/2016/06/06/brain-extraction-with-ants.html" target="_blank" rel="noopener"
&gt;&lt;code&gt;antsBrainExtraction.sh&lt;/code&gt;&lt;/a&gt; - &lt;a class="link" href="http://stnava.github.io/ANTs/" target="_blank" rel="noopener"
&gt;ANTs&lt;/a&gt; brain extraction script.
&lt;!--lint ignore double-link--&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://surfer.nmr.mgh.harvard.edu/fswiki/mri_watershed" target="_blank" rel="noopener"
&gt;&lt;code&gt;mri_watershed&lt;/code&gt;&lt;/a&gt; - Part of the &lt;a class="link" href="https://freesurfer.net/" target="_blank" rel="noopener"
&gt;Freesurfer&lt;/a&gt; pipeline.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.nitrc.org/projects/robex" target="_blank" rel="noopener"
&gt;ROBEX&lt;/a&gt; - Robust Brain Extraction without parameter tweaking.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="warping"&gt;Warping
&lt;/h3&gt;&lt;p&gt;Spatial normalization.&lt;/p&gt;
&lt;!--lint ignore double-link--&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="http://stnava.github.io/ANTs/" target="_blank" rel="noopener"
&gt;ANTs&lt;/a&gt; - Advanced Normalization Tools includes probabilistic tissue segmentation and machine learning methods based on expert-labeled data in order to maximize the reliability and consistency of multiple-modality image segmentation.
&lt;!--lint ignore double-link--&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://afni.nimh.nih.gov/pub/dist/doc/program_help/3dQwarp.html" target="_blank" rel="noopener"
&gt;3dQwarp&lt;/a&gt; - OpenMP parallelized &lt;a class="link" href="https://afni.nimh.nih.gov/" target="_blank" rel="noopener"
&gt;AFNI&lt;/a&gt; tool to compute a nonlinearly warped version of source dataset to match a base dataset.
&lt;!--lint ignore double-link--&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FNIRT/UserGuide" target="_blank" rel="noopener"
&gt;flirt, fnirt&lt;/a&gt; - Linear (flirt) and nonlinear (fnirt) warping tools provided by &lt;a class="link" href="https://fsl.fmrib.ox.ac.uk/fsl/fslwiki" target="_blank" rel="noopener"
&gt;FSL&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="templates"&gt;Templates
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.templateflow.org/" target="_blank" rel="noopener"
&gt;templateflow&lt;/a&gt; - A modular, version-controlled resource that allows researchers to use templates &amp;ldquo;off-the-shelf&amp;rdquo; and share new ones.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.bic.mni.mcgill.ca/ServicesAtlases/ICBM152NLin2009" target="_blank" rel="noopener"
&gt;MNI152&lt;/a&gt; - Unbiased standard magnetic resonance imaging template brain volume for the normal population.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="manipulation"&gt;Manipulation
&lt;/h3&gt;&lt;p&gt;Tools for doing math on voxel values.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://afni.nimh.nih.gov/pub/dist/doc/program_help/3dcalc.html" target="_blank" rel="noopener"
&gt;3dcalc&lt;/a&gt; - Voxel-by-voxel arithmetic on 1D to 4D datasets. From AFNI.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/Fslutils#:~:text=a%20combined%20image.-,fslmaths,--%20simple%20but%20powerful" target="_blank" rel="noopener"
&gt;fslmaths&lt;/a&gt; - Simple but powerful program to allow mathematical manipulation of images. From FSL.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.freesurfer.net/pub/dist/freesurfer/dev_binaries/centos6_x86_64/fscalc.fsl" target="_blank" rel="noopener"
&gt;fscalc&lt;/a&gt; - Freesurfer wrapper of fslmaths.
&lt;!--- Also see [#Libraries](#libraries) for development interfaces to be used within programming language.--&gt;
&lt;/li&gt;
&lt;/ul&gt;
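The voxelwise arithmetic these tools perform maps directly onto array operations. A minimal numpy sketch of roughly what `fslmaths input -thr 2.3 -bin output` does (the helper name and toy volume are made up; this is not the tools' own code):

```python
import numpy as np

def threshold_and_binarize(volume, thr):
    # Zero every voxel below the threshold, then binarize what is left,
    # roughly the numpy analogue of `fslmaths input -thr ... -bin output`.
    kept = np.where(np.greater_equal(volume, thr), volume, 0.0)
    return np.greater(kept, 0.0).astype(np.uint8)

# A toy 2x2x2 "volume" of statistic values.
vol = np.array([[[1.0, 2.5], [3.1, 0.2]],
                [[2.4, 2.2], [4.0, 2.31]]])
mask = threshold_and_binarize(vol, 2.3)
```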
&lt;h3 id="modeling"&gt;Modeling
&lt;/h3&gt;&lt;h4 id="hrf"&gt;HRF
&lt;/h4&gt;&lt;!--lint ignore double-link--&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://afni.nimh.nih.gov/pub/dist/doc/program_help/3dDeconvolve.html" target="_blank" rel="noopener"
&gt;3dDeconvolve&lt;/a&gt; - &lt;a class="link" href="https://afni.nimh.nih.gov/" target="_blank" rel="noopener"
&gt;AFNI&lt;/a&gt; - Program to calculate the deconvolution of a measurement 3D+time dataset with a specified input stimulus time series. This program can also perform multiple linear regression using multiple input stimulus time series.
&lt;!--lint ignore double-link--&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FEAT" target="_blank" rel="noopener"
&gt;FEAT&lt;/a&gt; - GUI guided analysis of simple experiment based on general linear modeling. Part of &lt;a class="link" href="https://fsl.fmrib.ox.ac.uk/fsl/fslwiki" target="_blank" rel="noopener"
&gt;FSL&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
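At their core, both 3dDeconvolve and FEAT fit a general linear model whose task regressors are stimulus timings convolved with a hemodynamic response function (HRF). A minimal numpy sketch of that idea, using an illustrative gamma-shaped HRF and synthetic data (the parameter values are not any tool's defaults):

```python
import numpy as np

# Build a task regressor by convolving event onsets with an HRF, then
# recover its amplitude by ordinary least squares, the essence of the
# GLM fit that 3dDeconvolve and FEAT perform.

def gamma_hrf(n_tr, tr=2.0, peak=6.0):
    t = np.arange(n_tr) * tr
    h = (t / peak) ** 2 * np.exp(-t / peak)
    return h / h.sum()

n = 100
stim = np.zeros(n)
stim[10::25] = 1.0                                 # sparse event onsets
regressor = np.convolve(stim, gamma_hrf(30))[:n]
X = np.column_stack([np.ones(n), regressor])       # intercept, task

rng = np.random.default_rng(0)
y = 5.0 + 2.0 * regressor + 0.01 * rng.standard_normal(n)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)       # beta[1] is near 2.0
```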
&lt;h4 id="mrsi"&gt;MRSI
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/schorschinho/LCModel" target="_blank" rel="noopener"
&gt;lcmodel&lt;/a&gt; - Implements linear-combination modeling of magnetic resonance spectroscopy data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FSL-MRS" target="_blank" rel="noopener"
&gt;FSL-MRS&lt;/a&gt; - A suite of tools for MR Spectroscopy, including single voxel (SVS), MRS imaging (MRSI), functional MRS (fMRS), diffusion MRS (dwMRS), edited spectroscopy, etc.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/schorschinho/osprey" target="_blank" rel="noopener"
&gt;Osprey&lt;/a&gt; - An all-in-one software suite for state-of-the art processing and quantitative analysis of in-vivo magnetic resonance spectroscopy (MRS) data.&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="eeg"&gt;EEG
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://fooof-tools.github.io/fooof/index.html" target="_blank" rel="noopener"
&gt;&lt;code&gt;fooof&lt;/code&gt;&lt;/a&gt; - Fast, efficient, and physiologically-informed tool to parameterize neural power spectra.&lt;/li&gt;
&lt;/ul&gt;
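The model behind fooof treats a power spectrum as an aperiodic 1/f component plus band-limited peaks. As a toy illustration of the idea it builds on (not the fooof algorithm itself), the aperiodic exponent of a peak-free spectrum is just the negative slope of a straight-line fit in log-log space:

```python
import numpy as np

# A synthetic, peak-free power spectrum with a known aperiodic exponent;
# a line fit in log-log space recovers it exactly.
freqs = np.linspace(2.0, 40.0, 200)
true_exponent = 1.5
power = freqs ** (-true_exponent)    # pure aperiodic spectrum, no peaks

slope, offset = np.polyfit(np.log10(freqs), np.log10(power), 1)
aperiodic_exponent = -slope          # reported as a positive exponent
```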
&lt;h4 id="misc"&gt;Misc
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/elifesciences-publications/ei_hurst/" target="_blank" rel="noopener"
&gt;hurst&lt;/a&gt; - Algorithm to assess intrinsic excitation-inhibition imbalance in MR (&lt;a class="link" href="http://doi.org/10.7554/eLife.55684" target="_blank" rel="noopener"
&gt;Trakoshis et al, eLife, 2020&lt;/a&gt;).
&lt;!--lint ignore double-link--&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://lncd.github.io/lncdtools/tat2/" target="_blank" rel="noopener"
&gt;tat2&lt;/a&gt; - Time-averaged T2* wrapper script using AFNI binaries from &lt;a class="link" href="https://github.com/lncd/lncdtools/" target="_blank" rel="noopener"
&gt;lncdtools&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="libraries"&gt;Libraries
&lt;/h2&gt;&lt;h3 id="python"&gt;Python
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://nipy.org/" target="_blank" rel="noopener"
&gt;nipy&lt;/a&gt; - Includes &lt;code&gt;nibabel&lt;/code&gt;, &lt;code&gt;nipype&lt;/code&gt;, and &lt;code&gt;nilearn&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="r"&gt;R
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/bjw34032/oro.nifti" target="_blank" rel="noopener"
&gt;oro.nifti&lt;/a&gt; - Functions for the input/output and visualization of medical imaging data that follow either the ANALYZE, NIfTI or AFNI formats.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="matlab"&gt;MATLAB
&lt;/h3&gt;&lt;!--lint ignore double-link--&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.fil.ion.ucl.ac.uk/spm/" target="_blank" rel="noopener"
&gt;SPM&lt;/a&gt; - Statistical Parametric Mapping refers to the construction and assessment of spatially extended statistical processes used to test hypotheses about functional imaging data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/tanguyduval/imtool3D_td" target="_blank" rel="noopener"
&gt;&lt;code&gt;imtool3D_td&lt;/code&gt;&lt;/a&gt; - 3D Image Viewer with ROI tools for Matlab (NIFTI viewer, Manual segmentation).&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="resources"&gt;Resources
&lt;/h2&gt;&lt;h3 id="blogs-books-and-docs"&gt;Blogs, Books, and Docs
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://andysbrainblog.com/" target="_blank" rel="noopener"
&gt;Andy&amp;rsquo;s brain blog&lt;/a&gt; - Tutorials and videos about neuroimaging analysis from start to finish in all the major software packages (AFNI, SPM, and FSL).&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://handbook.datalad.org/" target="_blank" rel="noopener"
&gt;DataLad handbook&lt;/a&gt; - Start-to-end use cases of specific applications in neuroimaging using provenance tracking software &lt;code&gt;datalad&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://learn-neuroimaging.github.io/hitchhackers_guide_brain/" target="_blank" rel="noopener"
&gt;Hitchhacker&amp;rsquo;s guide to the brain&lt;/a&gt; - Notes from study planning to reporting and data sharing by way of acquisition, processing, analysis, and quality control.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/Remi-Gau/online_neuroimaging_resources" target="_blank" rel="noopener"
&gt;Online Neuroimaging Resources&lt;/a&gt; - A laundry list of online resources for MRI, fMRI, EEG, MEG.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://neuroimaging-core-docs.readthedocs.io/" target="_blank" rel="noopener"
&gt;U of A: Neuroimaging Core Documentation&lt;/a&gt; - Documentation for approaches used and/or developed by the neuroimaging core at the University of Arizona.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="boards-and-chats"&gt;Boards And Chats
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://neurostars.org/" target="_blank" rel="noopener"
&gt;neurostars&lt;/a&gt; - General neuroimaging &lt;code&gt;discuss&lt;/code&gt; forum, and the suggested Q&amp;amp;A site for &lt;code&gt;fmriprep&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://discuss.afni.nimh.nih.gov" target="_blank" rel="noopener"
&gt;afni discuss&lt;/a&gt; - AFNI&amp;rsquo;s &lt;code&gt;discuss&lt;/code&gt; instance.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://mattermost.brainhack.org/" target="_blank" rel="noopener"
&gt;brainhack&lt;/a&gt; - A Mattermost community of neuroimagers.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="datasets-repositories"&gt;Datasets Repositories
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://openneuro.org/" target="_blank" rel="noopener"
&gt;openneuro&lt;/a&gt; - A free and open platform for validating and sharing BIDS-compliant MRI, PET, MEG, EEG, and iEEG data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://nda.nih.gov/" target="_blank" rel="noopener"
&gt;NDA&lt;/a&gt; - National Institute of Mental Health Data Archive (NDA) makes available human subjects data collected from hundreds of research projects across many scientific domains.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.nitrc.org/" target="_blank" rel="noopener"
&gt;NITRC&lt;/a&gt; - NeuroImaging Tools &amp;amp; Resources Collaboratory library of neuroinformatics software and data.&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="big-datasets"&gt;Big datasets
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://abcdstudy.org/" target="_blank" rel="noopener"
&gt;ABCD&lt;/a&gt; - Long-term Adolescent Brain Cognitive Development study including 1000s of longitudinal MRI scans.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.ukbiobank.ac.uk/" target="_blank" rel="noopener"
&gt;UK Biobank&lt;/a&gt; - A large-scale biomedical database and research resource with 500,000 research participants.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="http://www.ncanda.org/" target="_blank" rel="noopener"
&gt;NCANDA&lt;/a&gt; - National Consortium on Alcohol and Neurodevelopment in Adolescence (4000+ MR visits).&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.med.upenn.edu/bbl/philadelphianeurodevelopmentalcohort.html" target="_blank" rel="noopener"
&gt;PNC&lt;/a&gt; - A population-based sample of over 9500 individuals from the greater Philadelphia area, ages 8-21 years who received medical care at the CHOP network.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://enigma.ini.usc.edu/" target="_blank" rel="noopener"
&gt;ENIGMA&lt;/a&gt; - The Enhancing Neuro Imaging Genetics through Meta Analysis Consortium brings together imaging and genomics data from 50 working groups.&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>GitHub Awesome Stock Trading</title><link>https://hanguangwu.github.io/blog/en/p/github-awesome-stock-trading/</link><pubDate>Thu, 29 Jan 2026 14:34:25 -0800</pubDate><guid>https://hanguangwu.github.io/blog/en/p/github-awesome-stock-trading/</guid><description>&lt;h1 id="awesome-stock-trading"&gt;Awesome Stock Trading
&lt;/h1&gt;&lt;h2 id="introduction"&gt;Introduction
&lt;/h2&gt;&lt;p&gt;A curated list of resources for research and learning about stock trading and investing. It links to tools that can help anyone start or improve their stock trading skills, and aims to be comprehensive and useful for anyone interested in the field.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="stock-research"&gt;Stock Research
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://alphaspread.com" target="_blank" rel="noopener"
&gt;Alpha Spread&lt;/a&gt; - Provides data and tools for quantitative research and stock valuation.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.barchart.com" target="_blank" rel="noopener"
&gt;Barchart&lt;/a&gt; - Offers market data, analysis, and tools for commodity, stock, and forex traders.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.capitoltrades.com" target="_blank" rel="noopener"
&gt;Capitol Trades&lt;/a&gt; - Keeps tabs on politicians&amp;rsquo; trades (US only).&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://chartmill.com" target="_blank" rel="noopener"
&gt;Chartmill&lt;/a&gt; - A stock screening and analysis platform.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://danelfin.com" target="_blank" rel="noopener"
&gt;Danelfin&lt;/a&gt; - Provides AI-Powered Stock Research &amp;amp; Picking Tools.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://finbox.io/" target="_blank" rel="noopener"
&gt;Finbox&lt;/a&gt; - Offers tools for financial analysis, valuation, and screening of stocks.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.marketbeat.com" target="_blank" rel="noopener"
&gt;Market Beat&lt;/a&gt; - Provides stock research, ratings, and news for individual investors.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.marketscreener.com" target="_blank" rel="noopener"
&gt;Market Screener&lt;/a&gt; - Offers stock market quotes, news, analysis, and screening tools.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.morningstar.com" target="_blank" rel="noopener"
&gt;Morningstar&lt;/a&gt; - Provides investment research, ratings, and tools for stocks, mutual funds, and ETFs.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://seekingalpha.com" target="_blank" rel="noopener"
&gt;Seeking Alpha&lt;/a&gt; - Offers market news and analysis, portfolio management tools, and investment ideas from contributors.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://simplywall.st/" target="_blank" rel="noopener"
&gt;Simply Wall St&lt;/a&gt; - Takes a pictorial approach that cuts through large amounts of data quickly and effectively to narrow the field to a select few candidates.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://strike.market" target="_blank" rel="noopener"
&gt;Strike.Market&lt;/a&gt; - Offers a platform for trading options and derivatives on cryptocurrency markets.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.tipranks.com" target="_blank" rel="noopener"
&gt;Tip Ranks&lt;/a&gt; - Provides ratings and analysis of stocks and financial experts based on their historical performance.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.wallstreetzen.com" target="_blank" rel="noopener"
&gt;Wall Street Zen&lt;/a&gt; - Offers tools for financial analysis, screening, and backtesting of investment strategies.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://wallmine.com" target="_blank" rel="noopener"
&gt;Wallmine&lt;/a&gt; - Provides stock analysis, screening, and news for individual investors.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.zacks.com" target="_blank" rel="noopener"
&gt;Zacks&lt;/a&gt; - Provides research, analysis, and ratings for stocks and funds based on quantitative models and fundamental data.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="market-analysis"&gt;Market Analysis
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.companiesmarketcap.com" target="_blank" rel="noopener"
&gt;Companies Market Cap&lt;/a&gt; - Provides a list of companies and their market capitalizations, allowing users to easily track the valuations of various publicly traded companies. It also includes useful data such as industry classifications and stock exchange listings.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://edition.cnn.com/markets/fear-and-greed" target="_blank" rel="noopener"
&gt;Fear &amp;amp; Greed Index&lt;/a&gt; - Provides a market sentiment indicator for investors. It analyzes seven different indicators, including market volatility, investor sentiment, and safe-haven demand, to generate a score ranging from 0-100 that reflects whether the market is in a state of fear or greed.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.tradingterminal.com" target="_blank" rel="noopener"
&gt;Trading Terminal&lt;/a&gt; - Overview of the most important metrics for the US Market.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://tradytics.com" target="_blank" rel="noopener"
&gt;Tradytics&lt;/a&gt; - AI predictions, intraday market price action, biggest movers, sectors performance, and more.&lt;/li&gt;
&lt;/ul&gt;
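CNN's exact Fear and Greed methodology is proprietary, but the general shape of such a composite is simple: scale each indicator to 0-100 against a reference range, then combine. A toy sketch with made-up readings and ranges (the indicator names are illustrative, not CNN's seven components):

```python
import numpy as np

def to_score(value, lo, hi):
    # Scale a raw indicator reading to 0-100 within the range [lo, hi].
    return 100.0 * (value - lo) / (hi - lo)

# Hypothetical raw readings and historical ranges for three indicators.
indicators = [
    to_score(22.0, 10.0, 40.0),   # volatility level (inverted in practice)
    to_score(0.6, 0.0, 1.0),      # breadth: share of advancing stocks
    to_score(1.1, 0.8, 1.4),      # a put/call style sentiment ratio
]
composite = float(np.mean(indicators))   # one 0-100 sentiment score
```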
&lt;h2 id="stock-screener"&gt;Stock Screener
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.cnbc.com/stock-screener/" target="_blank" rel="noopener"
&gt;Cnbc Stock Screener&lt;/a&gt; - Stock screener for US stock market.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://finviz.com" target="_blank" rel="noopener"
&gt;Finviz&lt;/a&gt; - Free stock screener with financial visualizations.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.marketbeat.com/stock-screener/" target="_blank" rel="noopener"
&gt;Market Beat Stock Screener&lt;/a&gt; - US stock market screener that allows filtering stocks by market cap, sector, industry, and more.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://simplywall.st/features/stock-screener" target="_blank" rel="noopener"
&gt;Simply Wall St Stock Screener&lt;/a&gt; - Global stock screener that allows filtering stocks by market cap, sector, industry, and more.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.tipranks.com/screener" target="_blank" rel="noopener"
&gt;Tip Ranks Stock Screener&lt;/a&gt; - International stock market screener that allows filtering stocks by market cap, sector, industry, and more.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://scanner.tradingterminal.com" target="_blank" rel="noopener"
&gt;Trading Terminal Scanner&lt;/a&gt; - US market stock screener that allows filtering stocks by market cap, sector, industry, and more.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://wallmine.com/screener" target="_blank" rel="noopener"
&gt;Wallmine Free Stock Screener&lt;/a&gt; - Free stock screener that allows filtering stocks by market cap, sector, industry, and more.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.wallstreetzen.com/stock-screener" target="_blank" rel="noopener"
&gt;Wallstreet Zen Stock Screener&lt;/a&gt; - Free US stock market screener that allows filtering stocks by market cap, sector, industry, and more.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.zacks.com/stock-screener" target="_blank" rel="noopener"
&gt;Zacks Stock Screener&lt;/a&gt; - US stock market screener that allows filtering stocks by market cap, sector, industry, and more.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="charting"&gt;Charting
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.tradingview.com" target="_blank" rel="noopener"
&gt;TradingView&lt;/a&gt; - Platform that offers charting tools, trading ideas, and real-time market data for stocks, forex, cryptocurrencies, and other financial instruments.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://stockcharts.com" target="_blank" rel="noopener"
&gt;StockCharts&lt;/a&gt; - Technical analysis and charting website that provides advanced charting tools, custom indicators, and market analysis for stocks, funds, and indices.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="news"&gt;News
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.barrons.com" target="_blank" rel="noopener"
&gt;Barron&amp;rsquo;s&lt;/a&gt; - Financial magazine that provides news, analysis, and insights on the stock market, investing, and personal finance.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.benzinga.com" target="_blank" rel="noopener"
&gt;Benzinga&lt;/a&gt; - Financial news and data provider that delivers real-time market updates, stock analysis, and investment ideas.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.bloomberg.com" target="_blank" rel="noopener"
&gt;Bloomberg&lt;/a&gt; - Financial news and information company that covers business, markets, politics, and technology.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.investing.com" target="_blank" rel="noopener"
&gt;Investing&lt;/a&gt; - Online platform that offers financial news, real-time quotes, and analysis on stocks, currencies, commodities, and other investments.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.marketwatch.com" target="_blank" rel="noopener"
&gt;MarketWatch&lt;/a&gt; - Financial news website that provides business news, analysis, and stock market data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.fool.com" target="_blank" rel="noopener"
&gt;The Motley Fool&lt;/a&gt; - Investment website that provides stock market analysis, investing ideas, and personal finance advice.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.wsj.com" target="_blank" rel="noopener"
&gt;The Wall Street Journal&lt;/a&gt; - A business-focused newspaper that covers global news, markets, and economics.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.thestreet.com" target="_blank" rel="noopener"
&gt;The Street&lt;/a&gt; - Financial news and investing website that offers stock market analysis, investment strategies, and personal finance advice.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://finance.yahoo.com" target="_blank" rel="noopener"
&gt;Yahoo Finance&lt;/a&gt; - Financial news and data website that provides real-time stock quotes, financial news, and investment analysis.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="commentaries"&gt;Commentaries
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://am.jpmorgan.com/us/en/asset-management/adv/insights/market-insights/market-updates/" target="_blank" rel="noopener"
&gt;J.P. Morgan - Market Updates&lt;/a&gt; - Weekly commentaries to get market insights from J.P. Morgan.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.msci.com/research-and-insights/market-insights" target="_blank" rel="noopener"
&gt;MSCI - Market Insights&lt;/a&gt; - Market commentaries and research reports with a focus on macroeconomic topics.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="portfolio-tracker"&gt;Portfolio Tracker
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://portfoliovisualizer.com" target="_blank" rel="noopener"
&gt;Portfolio Visualizer&lt;/a&gt; - Portfolio management and analysis tool that provides portfolio optimization, backtesting, and risk analysis.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.wealthica.com" target="_blank" rel="noopener"
&gt;Wealthica&lt;/a&gt; - Wealth management platform that provides portfolio management, financial planning, and investment research.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="strategy-backtesting"&gt;Strategy Backtesting
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.screeningtale.com" target="_blank" rel="noopener"
&gt;Screening Tale&lt;/a&gt; - Backtesting platform that allows users to test their trading strategies on historical data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.quantconnect.com" target="_blank" rel="noopener"
&gt;QuantConnect&lt;/a&gt; - Algorithmic trading platform that provides backtesting, live trading, and research tools for stocks, forex, and cryptocurrencies.&lt;/li&gt;
&lt;/ul&gt;
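Backtesting platforms like the ones above all implement the same core loop: compute a trading signal from past data only, then apply it to the next period's return. A self-contained sketch using a textbook moving-average crossover rule (illustrative only, not a recommendation or any platform's code):

```python
import numpy as np

def backtest_ma_crossover(prices, fast=5, slow=20):
    p = np.asarray(prices, dtype=float)

    def sma(x, w):
        # trailing simple moving average; sma(x, w)[i] averages x[i:i+w]
        return np.convolve(x, np.ones(w) / w, mode="valid")

    fast_ma = sma(p, fast)[slow - fast:]   # align both series on end dates
    slow_ma = sma(p, slow)
    long_signal = np.greater(fast_ma, slow_ma).astype(float)
    rets = np.diff(p[slow - 1:]) / p[slow - 1:-1]
    # the signal known at day t is applied to the day t-to-t+1 return,
    # avoiding look-ahead bias
    strategy_rets = long_signal[:-1] * rets
    return float(np.prod(1.0 + strategy_rets) - 1.0)

# On a steadily trending synthetic series the rule stays long throughout.
prices = np.linspace(100.0, 150.0, 120)
total_return = backtest_ma_crossover(prices)
```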
&lt;h2 id="stock-picks"&gt;Stock Picks
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://seekingalpha.com/alpha-picks/" target="_blank" rel="noopener"
&gt;Alpha Picks&lt;/a&gt; - Alpha Picks gives you two top stock picks each month, sifted from Seeking Alpha&amp;rsquo;s analysis of thousands of stocks.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://pro.benzinga.com" target="_blank" rel="noopener"
&gt;Benzinga Pro&lt;/a&gt; - Benzinga offers daily trade picks from professional day traders with on-demand support, as well as exclusive market-moving stories.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.investopedia.com/best-stocks-to-buy-now/" target="_blank" rel="noopener"
&gt;Best Stocks to Buy Now&lt;/a&gt; - Investopedia list of the best stocks to buy now.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.earningsbeats.com" target="_blank" rel="noopener"
&gt;EarningsBeats&lt;/a&gt; - EarningsBeats.com provides a research and educational platform for investors, with services designed to help members beat the S&amp;amp;P 500.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://seekingalpha.com/groups" target="_blank" rel="noopener"
&gt;Investing Groups by Seeking Alpha&lt;/a&gt; - Investing groups on Seeking Alpha that provide stock picks, analysis, and market insights.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.fool.com/services/" target="_blank" rel="noopener"
&gt;The Motley Fool Stock Advisor&lt;/a&gt; - Stock Advisor is a premium service that provides stock picks, analysis, and market insights.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="stock-collections"&gt;Stock Collections
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://marketsmith.investors.com/growth250/" target="_blank" rel="noopener"
&gt;Growth 250&lt;/a&gt; - MarketSmith&amp;rsquo;s Growth 250 is a curated list of high-potential stocks.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://stocktwits.com/rankings/trending" target="_blank" rel="noopener"
&gt;StockTwits Top 10&lt;/a&gt; - StockTwits&amp;rsquo; list of the top 10 trending stocks.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="stock-apis"&gt;Stock APIs
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.alphavantage.co/" target="_blank" rel="noopener"
&gt;Alpha Vantage&lt;/a&gt; - Alpha Vantage offers free APIs for realtime and historical stock data, forex, and cryptocurrency data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://eodhistoricaldata.com" target="_blank" rel="noopener"
&gt;EOD Historical Data&lt;/a&gt; - Offers APIs for realtime and historical stock data, forex, and cryptocurrency data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://site.financialmodelingprep.com/" target="_blank" rel="noopener"
&gt;Financial Modeling Prep&lt;/a&gt; - The Financial Modeling Prep API provides real-time stock prices, company financial statements, major index prices, historical stock data, real-time forex rates, and cryptocurrency data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://marketstack.com" target="_blank" rel="noopener"
&gt;MarketStack&lt;/a&gt; - MarketStack offers APIs for realtime and historical stock data, forex, and cryptocurrency data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://developer.morningstar.com" target="_blank" rel="noopener"
&gt;Morningstar&lt;/a&gt; - Provides data, research, and reports.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://data.nasdaq.com" target="_blank" rel="noopener"
&gt;Nasdaq Data Link&lt;/a&gt; - Nasdaq Data Link offers a premier source for financial, economic and alternative datasets.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://massive.com/" target="_blank" rel="noopener"
&gt;Massive&lt;/a&gt; - Massive offers APIs for realtime and historical stock data, forex, and cryptocurrency data.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.refinitiv.com/en/products/eikon-trading-software/eikon-app-api-innovation/eikon-data-api" target="_blank" rel="noopener"
&gt;Refinitiv Eikon Data&lt;/a&gt; - The Eikon Data API allows applications to access data directly from Eikon or Refinitiv Workspace.&lt;/li&gt;
&lt;/ul&gt;
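Most of the APIs above are plain HTTPS endpoints that take URL-encoded query parameters. As a sketch, here is how to build (but not send) an Alpha Vantage daily-series request; the endpoint, the TIME_SERIES_DAILY function, and the "demo" key come from Alpha Vantage's public documentation:

```python
from urllib.parse import urlencode

# Construct an Alpha Vantage request URL from documented parameters.
# No network request is made here; fetching the URL returns JSON.
BASE = "https://www.alphavantage.co/query"

def daily_series_url(symbol, apikey="demo"):
    params = {
        "function": "TIME_SERIES_DAILY",
        "symbol": symbol,
        "apikey": apikey,
    }
    return BASE + "?" + urlencode(params)

url = daily_series_url("IBM")
```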
&lt;h2 id="knowledge"&gt;Knowledge
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.investopedia.com" target="_blank" rel="noopener"
&gt;Investopedia&lt;/a&gt; - Investopedia.com is a website that provides educational content, news, analysis, and tools related to investing, finance, and business.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.stockscreening101.com" target="_blank" rel="noopener"
&gt;StockScreening101&lt;/a&gt; - Provides educational content focused on stock screening for individual investors.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.wallstreetmojo.com" target="_blank" rel="noopener"
&gt;Wallstreetmojo&lt;/a&gt; - Learn Investment Banking, Finance Modeling and Excel with more than 4800+ Articles, Self Study Guides, Resources and courses.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="books"&gt;Books
&lt;/h2&gt;&lt;h4 id="value-investing-and-fundamental-analysis"&gt;Value Investing and Fundamental Analysis
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/22393486-berkshire-hathaway-letters-to-shareholders" target="_blank" rel="noopener"
&gt;Berkshire Hathaway Letters to Shareholders&lt;/a&gt; - Warren Buffett, 2016 &lt;br&gt;
For five decades, Warren Buffett has written an annual letter to his shareholders. The letters, written between 1965 and 2014, reveal the investor&amp;rsquo;s thoughts on investment strategy, share buybacks, corporate culture, and much more.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/25586616-common-stocks-and-uncommon-profits-and-other-writings-paperback-jan-0" target="_blank" rel="noopener"
&gt;Common Stocks and Uncommon Profits&lt;/a&gt; - Philip A. Fisher, 1957 &lt;br&gt;
This book is considered a classic and is used as part of several investment courses, such as Stanford Graduate School of Business. In the book, Fisher explains his basic views and approach to his investment strategies.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/21841022-damodaran-on-valuation" target="_blank" rel="noopener"
&gt;Damodaran on Valuation: Security Analysis for Investment and Corporate Finance&lt;/a&gt; - Aswath Damodaran, 1994 &lt;br&gt;
Damodaran explains the key aspects of valuation, from the basics of estimating cash flows and discount rates to the principles for using multiples.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/746936.Margin_of_Safety" target="_blank" rel="noopener"
&gt;Margin of Safety: Risk-Averse Value Investing Strategies for the Thoughtful Investor&lt;/a&gt; - Seth Klarman, 1991 &lt;br&gt;
Klarman explains the key fundamentals and practices of value investing, outlining what value investing looks like and where investors might find attractive opportunities.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://en.wikipedia.org/wiki/Security_Analysis_%28book%29" target="_blank" rel="noopener"
&gt;Security Analysis&lt;/a&gt; - Benjamin Graham and David Dodd, 1934 &lt;br&gt;
This book laid the intellectual foundation for what would later be called value investing. The first edition was published in 1934, shortly after the Wall Street crash and start of the Great Depression.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/369708.The_Alchemy_of_Finance" target="_blank" rel="noopener"
&gt;The Alchemy of Finance&lt;/a&gt; - George Soros, 1987 &lt;br&gt;
This book offers insight into the decision-making process of one of the most successful wealth managers, George Soros.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/500514.The_Dhandho_Investor" target="_blank" rel="noopener"
&gt;The Dhandho Investor: The Low-Risk Value Method to High Returns&lt;/a&gt; - Mohnish Pabrai, 2007 &lt;br&gt;
Written by Mohnish Pabrai, an investor of Indian origin, the book explains his value investing approach using the Dhandho capital allocation framework.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://en.wikipedia.org/wiki/The_Intelligent_Investor" target="_blank" rel="noopener"
&gt;The Intelligent Investor&lt;/a&gt; - Benjamin Graham, 1949 &lt;br&gt;
In this book, Graham explains his investment principles and his views on an investor&amp;rsquo;s mindset. He shows how investors can analyze the actual performance of companies and advises them to disregard the changing moods of the market.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/75893.The_Little_Book_of_Value_Investing" target="_blank" rel="noopener"
&gt;The Little Book of Value Investing&lt;/a&gt; - Christopher H. Browne, 2006 &lt;br&gt;
Browne explains the basic approaches of the value investing philosophy without making any big promises about market success, and provides guidance on the use of simple metrics such as the price-earnings ratio.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/41211699-the-most-important-thing" target="_blank" rel="noopener"
&gt;The Most Important Thing&lt;/a&gt; - Howard Marks, 2011 &lt;br&gt;
In this book, Howard Marks summarizes investment insights from his client memos and explains his investment philosophy.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/21949163-the-thoughtful-investor" target="_blank" rel="noopener"
&gt;The Thoughtful Investor&lt;/a&gt; - Basant Maheshwari, 2011 &lt;br&gt;
Maheshwari, a renowned Indian investor, covers financial analysis, the analysis of individual sectors, and the behavioral side of investing.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/209956.The_Warren_Buffett_Way" target="_blank" rel="noopener"
&gt;The Warren Buffett Way&lt;/a&gt; - Robert G. Hagstrom, 2007 &lt;br&gt;
The book describes the business and investment principles of value investing according to Warren Buffett.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/293636.Value_Investing" target="_blank" rel="noopener"
&gt;Value Investing: From Graham to Buffett and Beyond&lt;/a&gt; - Bruce C. N. Greenwald, 2004 &lt;/br&gt;
Greenwald explains the basic techniques of value investing and, in this context, illustrates their application using profiles of successful investors.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="quantitative-investing-and-portfolio-management"&gt;Quantitative Investing and Portfolio Management
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/537529.Active_Portfolio_Management" target="_blank" rel="noopener"
&gt;Active Portfolio Management&lt;/a&gt; - Richard C. Grinold, Ronald Kahn, 1994 &lt;/br&gt;
In this book, Grinold and Kahn show how economics, econometrics, and operations research can be used to solve practical investment problems and identify profit opportunities.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/2825008-portfolio-selection" target="_blank" rel="noopener"
&gt;Portfolio Selection: Efficient Diversification of Investments&lt;/a&gt; - Harry M. Markowitz, 1968 &lt;/br&gt;
A comprehensive explanation of analysis and calculation methods to help investors find the best combinations of securities to match their requirements.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/264468.Quantitative_Equity_Portfolio_Management" target="_blank" rel="noopener"
&gt;Quantitative Equity Portfolio Management&lt;/a&gt; - Ludwig B. Chincarini, Daehwan Kim, 2006 &lt;/br&gt;
The authors address the construction and management of a portfolio using quantitative methods. Among other things, they offer explanations of factor models and the prediction of premiums and exposures.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="general-stock-trading"&gt;General Stock Trading
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/891835.Beating_the_Street" target="_blank" rel="noopener"
&gt;Beating the Street&lt;/a&gt; - Peter Lynch, 1992 &lt;/br&gt;
In the book, Lynch, a successful fund manager from 1977 to 1990, gives readers insight into his investment methods and tactics.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/34889127-charlie-munger" target="_blank" rel="noopener"
&gt;Charlie Munger: The Complete Investor&lt;/a&gt; - Tren Griffin, 2015 &lt;/br&gt;
Tren Griffin uses interviews, writings, and letters to explain the investment philosophy and thought processes of Charlie Munger, vice chairman of Berkshire Hathaway and longtime business partner of Warren Buffett.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/966769.Market_Wizards" target="_blank" rel="noopener"
&gt;Market Wizards: Interviews with Top Traders&lt;/a&gt; - Jack D. Schwager, 1989 &lt;/br&gt;
By interviewing successful investors such as Bruce Kovner, Richard Dennis, Paul Tudor Jones, Michael Steinhardt, Ed Seykota, Marty Schwartz, Tom Baldwin, and others, Schwager explores what separates the world&amp;rsquo;s best traders from the vast majority of unsuccessful investors.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/762462.One_Up_On_Wall_Street" target="_blank" rel="noopener"
&gt;One Up On Wall Street&lt;/a&gt; - Peter Lynch, 1989 &lt;/br&gt;
Peter Lynch, who managed Fidelity Investments&amp;rsquo; successful Magellan Fund from 1977 to 1990, gives investors an insight into his investment methods. Using simple examples and practical advice, he explains his process of stock selection.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/965633.Stocks_for_the_Long_Run_" target="_blank" rel="noopener"
&gt;Stocks for the Long Run&lt;/a&gt; - Jeremy Siegel, 1994 &lt;/br&gt;
The book by Siegel, a finance professor, addresses how to build a balanced portfolio and explains how investors can avoid typical mistakes.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/350675.The_Battle_for_Investment_Survival_" target="_blank" rel="noopener"
&gt;The Battle for Investment Survival&lt;/a&gt; - Gerald M. Loeb, 1935 &lt;/br&gt;
First published in 1935, Gerald M. Loeb&amp;rsquo;s work is considered a classic of financial literature. Loeb straightforwardly explains how investors should behave in rising and falling markets.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/33026972-the-complete-turtle-trader" target="_blank" rel="noopener"
&gt;The Complete Turtle Trader&lt;/a&gt; - Michael W. Covel, 2007 &lt;/br&gt;
The author tells the story of Richard Dennis, an extraordinarily successful stock market trader of the 1980s, and how, under his guidance, a group of beginners became equally successful traders.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/128947547-the-craft-of-investing-by-john-train" target="_blank" rel="noopener"
&gt;The Craft of Investing&lt;/a&gt; - John Train, 1994 &lt;/br&gt;
In the book, Train outlines his key strategies and principles that have brought him success, addressing everything from the psychology of the market to practical portfolio management tips.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/16235023-the-little-book-of-trading" target="_blank" rel="noopener"
&gt;The Little Book of Trading&lt;/a&gt; - Michael W. Covel, 2011 &lt;/br&gt;
Michael W. Covel&amp;rsquo;s book offers insights into the rules and philosophies used by successful traders. Drawing on the author&amp;rsquo;s own trading experience and the wisdom of other traders, it delivers advice in a direct and easy-to-understand manner.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/8247775-the-little-book-that-still-beats-the-market" target="_blank" rel="noopener"
&gt;The Little Book that Still Beats the Market&lt;/a&gt; - Joel Greenblatt, 2007 &lt;/br&gt;
Joel Greenblatt explains how investors can outperform the popular market averages by systematically applying a formula. The book is kept simple and is aimed at beginners.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/27224350-new-market-wizards" target="_blank" rel="noopener"
&gt;The New Market Wizards&lt;/a&gt; - Jack D. Schwager, 1992 &lt;/br&gt;
Jack Schwager interviews some of the most successful stock traders in the United States, offering insight into their strategies, perspectives, and psychology.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/34943907-think-trade-like-a-champion" target="_blank" rel="noopener"
&gt;Think &amp;amp; Trade Like a Champion&lt;/a&gt; - Mark Minervini, 2017 &lt;/br&gt;
In this book, Mark Minervini shows readers how to apply his methods step by step to enhance their trading performance and build the confidence they need to outperform.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="trend-following"&gt;Trend Following
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/12664470-investing-with-volume-analysis" target="_blank" rel="noopener"
&gt;Investing with Volume Analysis&lt;/a&gt; - Buff Dormeier, 2011 &lt;/br&gt;
Dormeier offers insights into using volume metrics to enhance stock trading strategies, providing a framework for interpreting price-volume relationships to predict market trends.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/25740068-learn-to-trade-momentum-stocks" target="_blank" rel="noopener"
&gt;Learn to Trade Momentum Stocks&lt;/a&gt; - Matthew R. Kratter, 2015 &lt;/br&gt;
This beginner-friendly book presents a trading strategy by Matthew R. Kratter. It is designed to give readers the knowledge and skills to make profitable trades in momentum stocks.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/25819574-stocks-on-the-move" target="_blank" rel="noopener"
&gt;Stocks on the Move&lt;/a&gt; - Andreas Clenow, 2015 &lt;/br&gt;
In this book, Clenow, a hedge fund manager, explores why most mutual funds consistently underperform and shows how anyone can outperform them. He emphasizes the power of momentum investing as one of the few consistent ways to beat the markets.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/20428445-trading-the-trends" target="_blank" rel="noopener"
&gt;Trading the Trends&lt;/a&gt; - L. A. Little, 2011 &lt;/br&gt;
The book covers various aspects, including identifying trends, using technical indicators, and managing risk. The content is written in an easy-to-understand style, providing readers with valuable insights into understanding and profiting from the stock market.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/34855405-trend-following" target="_blank" rel="noopener"
&gt;Trend Following&lt;/a&gt; - Michael W. Covel, 2004 &lt;/br&gt;
Michael W. Covel&amp;rsquo;s book explains trend following without prescribing specific strategies and shows how to apply it in any market situation, bull or bear.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/18969247-trend-qualification-and-trading" target="_blank" rel="noopener"
&gt;Trend Qualification and Trading&lt;/a&gt; - L. A. Little, 2011 &lt;/br&gt;
Through a proven technical approach, the book explains how to gauge the likelihood of trend continuation for better trading results. Readers will gain an understanding of timing entries, taking profits, and effectively exiting trades based on these trends.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/19569996-trend-trading-set-ups" target="_blank" rel="noopener"
&gt;Trend Trading Set-Ups&lt;/a&gt; - L. A. Little, 2012 &lt;/br&gt;
Building on the neoclassical concept, Little presents traders and investors with a robust methodology to discover promising trade setups and achieve precise timing for trade entry.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="price-action-trading"&gt;Price Action Trading
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/29460388-price-action-breakdown" target="_blank" rel="noopener"
&gt;Price Action Breakdown&lt;/a&gt; - Laurentiu Damir, 2016 &lt;/br&gt;
This book provides a comprehensive guide to pure price action analysis. It covers concepts, ideas, and trading methods based purely on price action that can be applied to various financial markets.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/55854507-price-action-trading-secrets" target="_blank" rel="noopener"
&gt;Price Action Trading Secrets&lt;/a&gt; - Rayner Teo, 2020 &lt;/br&gt;
Rayner Teo&amp;rsquo;s book is intended as a guide to the use of price action trading. The book covers trading strategies, instruments and techniques and is written in a simple, step-by-step manner.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/19138622-pring-on-price-patterns" target="_blank" rel="noopener"
&gt;Pring on Price Patterns&lt;/a&gt; - Martin J. Pring, 2009 &lt;/br&gt;
Martin J. Pring&amp;rsquo;s book provides a comprehensive examination of the most commonly used price patterns and offers insights into their effectiveness and logic. The book covers a range of patterns, including one- and two-bar patterns, outside bars, reversals, pennants, and more.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/38296614-stock-trading-investing-using-volume-price-analysis" target="_blank" rel="noopener"
&gt;Stock Trading &amp;amp; Investing Using Volume Price Analysis&lt;/a&gt; - Anna Coulling, 2015 &lt;/br&gt;
Anna Coulling provides an in-depth examination of volume price analysis in stock trading. In doing so, she examines the approaches of other successful price analysis practitioners and explains them with examples.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/19169498-trading-price-action-trends" target="_blank" rel="noopener"
&gt;Trading Price Action Trends&lt;/a&gt; - Al Brooks, 2011 &lt;/br&gt;
Al Brooks&amp;rsquo; book is intended as a practical guide to profiting from institutional trading trends. The book breaks down Brooks&amp;rsquo; trading system into its essential components such as institutional piggybacking or trend trading.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="behavioral-finance-and-psychological-aspect-of-investing"&gt;Behavioral Finance and Psychological Aspect of Investing
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/100132.Irrational_Exuberance" target="_blank" rel="noopener"
&gt;Irrational Exuberance&lt;/a&gt; - Robert J. Shiller, 2000 &lt;/br&gt;
Robert J. Shiller&amp;rsquo;s book addresses the psychological and behavioral factors that influence financial markets. It explores the concept of speculative bubbles and irrational exuberance, in which exuberant investor behavior leads to overvaluation of assets.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/6922765-the-little-book-of-behavioral-investing" target="_blank" rel="noopener"
&gt;The Little Book of Behavioral Investing: How not to be your own worst enemy&lt;/a&gt; - James Montier, 2010 &lt;/br&gt;
Author James Montier looks at the psychological aspects of investing and examines common behavioral biases that can hinder investors&amp;rsquo; success in the marketplace.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/68143.The_Wisdom_of_Crowds" target="_blank" rel="noopener"
&gt;The Wisdom of Crowds&lt;/a&gt; - James Surowiecki, 2004 &lt;/br&gt;
James Surowiecki&amp;rsquo;s book addresses the concept that large groups of people collectively have higher intelligence than individual experts. Surowiecki supports his argument with various case studies and anecdotes from different fields.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="risk-and-uncertainty"&gt;Risk and Uncertainty
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/128429.Against_the_Gods" target="_blank" rel="noopener"
&gt;Against the Gods&lt;/a&gt; - Peter L. Bernstein, 1996 &lt;/br&gt;
Bernstein takes the reader on a journey through time, showing how societies throughout history have dealt with uncertainty and developed methods for measuring and managing risk. The author shows the profound impact of risk and probability on human decision making and the development of modern finance.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://en.wikipedia.org/wiki/Fooled_by_Randomness" target="_blank" rel="noopener"
&gt;Fooled by Randomness&lt;/a&gt; - Nassim Nicholas Taleb, 2001 &lt;/br&gt;
Part of Taleb&amp;rsquo;s multi-volume philosophical essay on uncertainty, this book examines various misconceptions of chance, including survivorship bias and skewed distributions, and illuminates how people tend to seek explanations even when there are none.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/20914691-the-5-mistakes-every-investor-makes-and-how-to-avoid-them" target="_blank" rel="noopener"
&gt;The 5 Mistakes Every Investor Makes and How to Avoid Them&lt;/a&gt; - Peter Mallouk, 2014 &lt;/br&gt;
Mallouk&amp;rsquo;s work discusses the most common mistakes investors make and how to avoid them. Among the most important lessons are the pitfalls of market timing and active trading, which are suboptimal compared to passive strategies.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a class="link" href="https://en.wikipedia.org/wiki/The_Black_Swan:_The_Impact_of_the_Highly_Improbable" target="_blank" rel="noopener"
&gt;The Black Swan&lt;/a&gt; - Nassim Nicholas Taleb, 2007 &lt;/br&gt;
Taleb explores the concept of so-called black swans. These are rare and unpredictable events that have massive consequences and are often rationalized retroactively.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="contemporary-history"&gt;Contemporary History
&lt;/h4&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://www.goodreads.com/book/show/1308591.Dot_con" target="_blank" rel="noopener"
&gt;Dot.con: How America Lost Its Mind and Money in the Internet Era&lt;/a&gt; - John Cassidy, 2002 &lt;/br&gt;
Cassidy chronicles the rise and fall of the dot-com bubble with insight and flair. He introduces the key players and events that shaped the Internet era, from visionary entrepreneurs to ruthless investors.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="most-important-stock-exchanges"&gt;Most Important Stock Exchanges
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;🇦🇺 &lt;a class="link" href="https://www.asx.com.au" target="_blank" rel="noopener"
&gt;Australian Securities Exchange (ASX)&lt;/a&gt; - The largest stock exchange in Australia, known for its strong mining and resource sector.&lt;/li&gt;
&lt;li&gt;🇧🇷 &lt;a class="link" href="https://www.b3.com.br" target="_blank" rel="noopener"
&gt;B3 (Bovespa)&lt;/a&gt; - The largest stock exchange in Latin America and one of the fastest-growing markets globally.&lt;/li&gt;
&lt;li&gt;🇮🇳 &lt;a class="link" href="https://www.bseindia.com" target="_blank" rel="noopener"
&gt;Bombay Stock Exchange (BSE)&lt;/a&gt; - The oldest stock exchange in Asia, founded in 1875, and one of the largest in India.&lt;/li&gt;
&lt;li&gt;🇩🇪 &lt;a class="link" href="https://www.deutsche-boerse.com" target="_blank" rel="noopener"
&gt;Frankfurt Stock Exchange (XETRA)&lt;/a&gt; - The largest stock exchange in Germany, owned and operated by Deutsche Börse.&lt;/li&gt;
&lt;li&gt;🇭🇰 &lt;a class="link" href="https://www.hkex.com.hk/" target="_blank" rel="noopener"
&gt;Hong Kong Stock Exchange (HKEX)&lt;/a&gt; - One of the largest stock exchanges in Asia, known for attracting international investors due to its strategic location and listing rules.&lt;/li&gt;
&lt;li&gt;🇰🇷 &lt;a class="link" href="https://www.koreaexchange.org" target="_blank" rel="noopener"
&gt;Korea Exchange (KRX)&lt;/a&gt; - The sole securities exchange operator in South Korea, hosting the KOSPI index.&lt;/li&gt;
&lt;li&gt;🇬🇧 &lt;a class="link" href="https://www.londonstockexchange.com" target="_blank" rel="noopener"
&gt;London Stock Exchange (LSE)&lt;/a&gt; - One of the oldest and most important stock exchanges in the world, with over 3,000 listed companies.&lt;/li&gt;
&lt;li&gt;🇺🇸 &lt;a class="link" href="https://www.nasdaq.com" target="_blank" rel="noopener"
&gt;NASDAQ&lt;/a&gt; - Second largest stock exchange in the world by market capitalization, known for listing technology companies and having a high trading volume.&lt;/li&gt;
&lt;li&gt;🇮🇳 &lt;a class="link" href="https://www.nseindia.com" target="_blank" rel="noopener"
&gt;National Stock Exchange of India (NSE)&lt;/a&gt; - The largest stock exchange in India by market capitalization and trading volume.&lt;/li&gt;
&lt;li&gt;🇺🇸 &lt;a class="link" href="https://www.nyse.com" target="_blank" rel="noopener"
&gt;New York Stock Exchange (NYSE)&lt;/a&gt; - Largest stock exchange in the world by market capitalization, with over 2,800 listed companies.&lt;/li&gt;
&lt;li&gt;🇨🇳 &lt;a class="link" href="https://www.sse.com.cn/" target="_blank" rel="noopener"
&gt;Shanghai Stock Exchange (SSE)&lt;/a&gt; - Fourth largest stock exchange in the world by market capitalization, and the largest in mainland China.&lt;/li&gt;
&lt;li&gt;🇨🇳 &lt;a class="link" href="https://www.szse.cn/" target="_blank" rel="noopener"
&gt;Shenzhen Stock Exchange (SZSE)&lt;/a&gt; - One of the largest stock exchanges in China, known for its focus on technology and growth companies.&lt;/li&gt;
&lt;li&gt;🇸🇬 &lt;a class="link" href="https://www.sgx.com" target="_blank" rel="noopener"
&gt;Singapore Exchange (SGX)&lt;/a&gt; - One of the leading Asian exchanges, known for its regulatory excellence and derivatives market.&lt;/li&gt;
&lt;li&gt;🇨🇭 &lt;a class="link" href="https://www.six-group.com" target="_blank" rel="noopener"
&gt;Swiss Stock Exchange (SIX)&lt;/a&gt; - The principal stock exchange in Switzerland, known for its life sciences and financial services companies.&lt;/li&gt;
&lt;li&gt;🇯🇵 &lt;a class="link" href="https://www.jpx.co.jp/english/" target="_blank" rel="noopener"
&gt;Tokyo Stock Exchange (TSE)&lt;/a&gt; - Third largest stock exchange in the world by market capitalization, and the largest in Asia.&lt;/li&gt;
&lt;li&gt;🇨🇦 &lt;a class="link" href="https://www.tsx.com" target="_blank" rel="noopener"
&gt;Toronto Stock Exchange (TSX)&lt;/a&gt; - The largest stock exchange in Canada and a major global mining and energy hub.&lt;/li&gt;
&lt;li&gt;🇪🇺 &lt;a class="link" href="https://www.euronext.com/en" target="_blank" rel="noopener"
&gt;Euronext&lt;/a&gt; - A pan-European stock exchange operating in several countries, including France, the Netherlands, Belgium, Portugal, and Ireland.&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>GitHub Awesome Roadmaps</title><link>https://hanguangwu.github.io/blog/en/p/github-awesome-roadmaps/</link><pubDate>Thu, 29 Jan 2026 13:34:25 -0800</pubDate><guid>https://hanguangwu.github.io/blog/en/p/github-awesome-roadmaps/</guid><description>&lt;h1 id="awesome-roadmaps"&gt;Awesome Roadmaps
&lt;/h1&gt;&lt;h2 id="introduction"&gt;Introduction
&lt;/h2&gt;&lt;p&gt;A curated list of roadmaps, mostly about software development, which give you a clear route to improve your knowledge or skills.&lt;/p&gt;
&lt;h2 id="programming-language"&gt;Programming Language
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/salmer/CppDeveloperRoadmap" target="_blank" rel="noopener"
&gt;C++ Developer Roadmap&lt;/a&gt; - Roadmap focusing on general C++ competencies and skills in 2024 &lt;a class="link" href="https://github.com/salmer/CppDeveloperRoadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2024-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/Alikhll/golang-developer-roadmap" target="_blank" rel="noopener"
&gt;Go Developer Roadmap&lt;/a&gt; - Roadmap to becoming a Go developer in 2021 &lt;a class="link" href="https://github.com/Alikhll/golang-developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2021-yellowgreen.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/s4kibs4mi/java-developer-roadmap" target="_blank" rel="noopener"
&gt;Java Developer Roadmap&lt;/a&gt; - Roadmap to becoming a Java developer in 2025 &lt;a class="link" href="https://github.com/s4kibs4mi/java-developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2025-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/aliyr/Nodejs-Developer-Roadmap" target="_blank" rel="noopener"
&gt;Nodejs Developer Roadmap&lt;/a&gt; - Roadmap to becoming a Node.js developer in 2021 &lt;a class="link" href="https://github.com/aliyr/Nodejs-Developer-Roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2021-yellowgreen.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/thecodeholic/php-developer-roadmap" target="_blank" rel="noopener"
&gt;PHP Developer roadmap&lt;/a&gt; - Roadmap to becoming a PHP developer in 2021 &lt;a class="link" href="https://github.com/thecodeholic/php-developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2021-green.svg"&gt;&lt;/a&gt; &lt;a class="link" href="https://github.com/thecodeholic/php-developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/YouTube-FF0000?logo=youtube"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/anshulrgoyal/rust-web-developer-roadmap" target="_blank" rel="noopener"
&gt;Rust Web Developer Roadmap&lt;/a&gt; - Roadmap to becoming a Rust Web developer in 2022 &lt;a class="link" href="https://github.com/anshulrgoyal/rust-web-developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2022-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="web-development"&gt;Web Development
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/sulco/angular-developer-roadmap" target="_blank" rel="noopener"
&gt;Angular Developer Roadmap&lt;/a&gt; - Roadmap to becoming an Angular developer &lt;a class="link" href="https://github.com/sulco/angular-developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2018-yellow.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/saifaustcse/angular-developer-roadmap" target="_blank" rel="noopener"
&gt;Angular Developer Roadmap 2&lt;/a&gt; - Roadmap to becoming an Angular developer in 2024 &lt;a class="link" href="https://github.com/saifaustcse/angular-developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2024-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/MoienTajik/AspNetCore-Developer-Roadmap" target="_blank" rel="noopener"
&gt;ASP.Net Core Developer Roadmap&lt;/a&gt; - Roadmap to becoming an ASP.NET Core developer in 2025 &lt;a class="link" href="https://github.com/MoienTajik/AspNetCore-Developer-Roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2025-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/kamranahmedse/developer-roadmap" target="_blank" rel="noopener"
&gt;Developer Roadmap&lt;/a&gt; - Community-driven roadmaps, articles, and resources for developers &lt;a class="link" href="https://github.com/kamranahmedse/developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2022-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/sadanandpai/frontend-learning-kit/blob/main/public/2024_FE_roadmap.pdf" target="_blank" rel="noopener"
&gt;Frontend development roadmap&lt;/a&gt; - Frontend development interview checklist &amp;amp; roadmap &lt;a class="link" href="https://github.com/sadanandpai/frontend-learning-kit/blob/main/public/2024_FE_roadmap.pdf" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2024-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/Hasnayeen/laravel-developer-roadmap" target="_blank" rel="noopener"
&gt;Laravel Developer Roadmap&lt;/a&gt; - Roadmap to becoming a Laravel developer in 2024 &lt;a class="link" href="https://github.com/Hasnayeen/laravel-developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2024-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/adam-golab/react-developer-roadmap" target="_blank" rel="noopener"
&gt;React Developer Roadmap&lt;/a&gt; - Roadmap to becoming a React developer in 2019 &lt;a class="link" href="https://github.com/adam-golab/react-developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2019-yellowgreen.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/flaviocopes/vue-developer-roadmap" target="_blank" rel="noopener"
&gt;Vue Developer Roadmap&lt;/a&gt; - Roadmap to becoming a Vue.js developer in 2019 &lt;a class="link" href="https://github.com/flaviocopes/vue-developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2019-yellowgreen.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="mobile-development"&gt;Mobile Development
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/mobile-roadmap/android-developer-roadmap" target="_blank" rel="noopener"
&gt;Android Developer Roadmap&lt;/a&gt; - Roadmap to becoming an Android developer in 2020 &lt;a class="link" href="https://github.com/mobile-roadmap/android-developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2020-yellowgreen.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/olexale/flutter_roadmap" target="_blank" rel="noopener"
&gt;Flutter Developer Roadmap&lt;/a&gt; - Roadmap for creating hybrid apps using Google&amp;rsquo;s Flutter SDK &lt;a class="link" href="https://github.com/olexale/flutter_roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2024-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/BohdanOrlov/iOS-Developer-Roadmap" target="_blank" rel="noopener"
&gt;iOS Developer Roadmap&lt;/a&gt; - Roadmap to becoming an iOS developer in 2020 &lt;a class="link" href="https://github.com/BohdanOrlov/iOS-Developer-Roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2020-yellowgreen.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="game-development"&gt;Game Development
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/utilForever/game-developer-roadmap" target="_blank" rel="noopener"
&gt;Game Developer Roadmap&lt;/a&gt; - Roadmap to becoming a game developer in 2022 &lt;a class="link" href="https://github.com/utilForever/game-developer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2022-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/miloyip/game-programmer" target="_blank" rel="noopener"
&gt;Game Programming Roadmap&lt;/a&gt; - Roadmap to becoming a game programmer &lt;a class="link" href="https://github.com/miloyip/game-programmer" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2019-yellow.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="ai--machine-learning--data-science"&gt;AI / Machine Learning / Data Science
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/AMAI-GmbH/AI-Expert-Roadmap" target="_blank" rel="noopener"
&gt;AI Expert Roadmap&lt;/a&gt; - Roadmap to becoming an Artificial Intelligence Expert in 2022 &lt;a class="link" href="https://github.com/AMAI-GmbH/AI-Expert-Roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2022-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/floodsung/Deep-Learning-Papers-Reading-Roadmap" target="_blank" rel="noopener"
&gt;Deep Learning Reading Roadmap&lt;/a&gt; - Roadmap through seminal deep learning papers &lt;a class="link" href="https://github.com/floodsung/Deep-Learning-Papers-Reading-Roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2022-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/instillai/deep-learning-roadmap" target="_blank" rel="noopener"
&gt;Deep Learning Roadmap&lt;/a&gt; - Roadmap to getting started with deep learning &lt;a class="link" href="https://github.com/instillai/deep-learning-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2020-yellowgreen.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/datastacktv/data-engineer-roadmap" target="_blank" rel="noopener"
&gt;Data Engineer Roadmap&lt;/a&gt; - Roadmap to becoming a data engineer in 2021 &lt;a class="link" href="https://github.com/datastacktv/data-engineer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2021-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/boringPpl/data-science-roadmap" target="_blank" rel="noopener"
&gt;Data Science Roadmap&lt;/a&gt; - Roadmap to becoming a data scientist &lt;a class="link" href="https://github.com/boringPpl/data-science-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2020-yellow.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/MrMimic/data-scientist-roadmap" target="_blank" rel="noopener"
&gt;Data Scientist Roadmap&lt;/a&gt; - Roadmap of tutorials for those interested in data science &lt;a class="link" href="https://github.com/MrMimic/data-scientist-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2024-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/graykode/nlp-roadmap" target="_blank" rel="noopener"
&gt;NLP Roadmap&lt;/a&gt; - Roadmap for learning Natural Language Processing in 2019 &lt;a class="link" href="https://github.com/graykode/nlp-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2019-yellowgreen.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="miscellaneous"&gt;Miscellaneous
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/fityanos/awesome-quality-assurance-roadmap" target="_blank" rel="noopener"
&gt;Awesome Quality Assurance Roadmap&lt;/a&gt; - Roadmap covering the QA and software testing learning curve you need to start the journey &lt;a class="link" href="https://github.com/fityanos/awesome-quality-assurance-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2021-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/Sundowndev/hacker-roadmap" target="_blank" rel="noopener"
&gt;Hacker Roadmap&lt;/a&gt; - Roadmap for amateur pen testers and a collection of hacking tools, resources and references &lt;a class="link" href="https://github.com/Sundowndev/hacker-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2023-yellow.svg"&gt;&lt;/a&gt;❗.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/AlaaAttya/software-architect-roadmap" target="_blank" rel="noopener"
&gt;Software Architect Roadmap&lt;/a&gt; - Roadmap for becoming a software architect &lt;a class="link" href="https://github.com/AlaaAttya/software-architect-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2018-yellow.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/stemmlerjs/software-design-and-architecture-roadmap" target="_blank" rel="noopener"
&gt;Software Design and Architecture Roadmap&lt;/a&gt; - A software design and architecture roadmap for any developer &lt;a class="link" href="https://github.com/stemmlerjs/software-design-and-architecture-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2019-yellow.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/mohsenshafiei/system-design-master-plan" target="_blank" rel="noopener"
&gt;System Design Roadmap&lt;/a&gt; - Roadmap to learn system design and architecture &lt;a class="link" href="https://github.com/mohsenshafiei/system-design-master-plan" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2024-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/togiberlin/ui-ux-designer-roadmap" target="_blank" rel="noopener"
&gt;UI/UX Designer Roadmap&lt;/a&gt; - Roadmap to becoming a UI/UX designer in 2017 &lt;a class="link" href="https://github.com/togiberlin/ui-ux-designer-roadmap" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2017-yellow.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/IlIllII/collecobrary" target="_blank" rel="noopener"
&gt;University Degree Roadmap&lt;/a&gt; - Roadmap for taking online university courses in various degree subjects &lt;a class="link" href="https://github.com/IlIllII/collecobrary" target="_blank" rel="noopener"
&gt;&lt;img src="https://img.shields.io/badge/Roadmap-2024-green.svg"&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="articles"&gt;Articles
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://medium.com/mindorks/a-roadmap-to-become-a-better-android-developer-3038cf7f8c8d" target="_blank" rel="noopener"
&gt;A Roadmap To Become A Better Android Developer&lt;/a&gt; - A collection of articles to provide a proper roadmap to become a better Android Developer &lt;img src="https://img.shields.io/badge/Medium-000000?logo=medium"&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.scaler.com/blog/java-full-stack-developer-roadmap/" target="_blank" rel="noopener"
&gt;Java Full Stack Developer Roadmap&lt;/a&gt; - Discover all the tech career roadmaps and latest market trends in the tech job market &lt;img src="https://img.shields.io/badge/Java-0000FF"&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://www.appliedaicourse.com/blog/data-analytics-roadmap/" target="_blank" rel="noopener"
&gt;Data Analytics Roadmap&lt;/a&gt; - Unlock success with this comprehensive roadmap: your guide to mastering analytics skills and career growth!&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>GitHub Repo Awesome Public Datasets</title><link>https://hanguangwu.github.io/blog/en/p/github-repo-awesome-public-datasets/</link><pubDate>Wed, 31 Dec 2025 13:34:25 -0800</pubDate><guid>https://hanguangwu.github.io/blog/en/p/github-repo-awesome-public-datasets/</guid><description>&lt;h1 id="awesome-public-datasets"&gt;Awesome Public Datasets
&lt;/h1&gt;&lt;p&gt;This is a list of high-quality &lt;a class="link" href="https://github.com/awesomedata/awesome-public-datasets" target="_blank" rel="noopener"
&gt;topic-centric public data sources&lt;/a&gt;, collected and curated from blogs, answers, and user responses. Most of the datasets listed below are free; however, some are not. This project was incubated at
&lt;a class="link" href="https://github.com/OMNILab" target="_blank" rel="noopener"
&gt;OMNILab&lt;/a&gt;, Shanghai Jiao Tong University,
during Xiaming Chen's Ph.D. studies. OMNILab is now part of the
&lt;a class="link" href="https://github.com/Bai-Yu-Lan" target="_blank" rel="noopener"
&gt;BaiYuLan Open AI community&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Other
amazingly awesome lists can be found in &lt;a class="link" href="https://github.com/sindresorhus/awesome" target="_blank" rel="noopener"
&gt;sindresorhus's
awesome&lt;/a&gt; list.&lt;/p&gt;
&lt;h2 id="agriculture"&gt;Agriculture
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://doi.pangaea.de/10.1594/PANGAEA.909132" target="_blank" rel="noopener"
&gt;The global dataset of historical yields for major crops
1981&amp;ndash;2016 - The Global Dataset of
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Agriculture/Global-dataset-of-historical-yields-for-major-crops.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://doi.org/10.5281/zenodo.1227837" target="_blank" rel="noopener"
&gt;Hyperspectral benchmark dataset on soil moisture - This dataset was
measured in a five-day
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Agriculture/Hyperspectral-Benchmark-Dataset-On-Soil-Moisture.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/softwaremill/lemon-dataset" target="_blank" rel="noopener"
&gt;Lemons quality control dataset - Lemon dataset has been prepared to
investigate the
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Agriculture/Lemon-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.indexdatabase.de/db/i-single.php?id=63" target="_blank" rel="noopener"
&gt;Optimized Soil Adjusted Vegetation Index - The IDB is a tool for
working with remote sensing
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Agriculture/Optimized%20Soil%20Adjusted%20Vegetation%20Index)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://fdc.nal.usda.gov/download-datasets" target="_blank" rel="noopener"
&gt;U.S. Department of Agriculture's Nutrient Database - USDA National
Nutrient Database for
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Agriculture/U.S.-Department-of-Agricultures-Nutrient-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://plants.usda.gov/downloads" target="_blank" rel="noopener"
&gt;U.S. Department of Agriculture's PLANTS Database - The Complete
PLANTS Checklist is nearly 7
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Agriculture/U.S.-Department-of-Agricultures-PLANTS-Database.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="architecture"&gt;Architecture
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://zenodo.org/record/7070952#.Y0mACy0RqO0" target="_blank" rel="noopener"
&gt;Swiss Apartment Models - This dataset contains detailed data on
42,207 apartments (242,257
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Architecture/appartment-models.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="biology"&gt;Biology
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.internationalgenome.org/data" target="_blank" rel="noopener"
&gt;1000 Genomes - The 1000 Genomes Project ran between 2008 and 2015,
creating the largest
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/1000-Genomes.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://anhir.grand-challenge.org/" target="_blank" rel="noopener"
&gt;ANHIR - Automatic Non-rigid Histological Image Registration (ANHIR)
consists of 2D \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/ANHIR.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/biocore/American-Gut" target="_blank" rel="noopener"
&gt;American Gut (Microbiome Project) - The American Gut project is the
largest crowdsourced
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/American-Gut-Microbiome-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://bupt-ai-cz.github.io/BCNB/" target="_blank" rel="noopener"
&gt;BCNB - There are WSIs of 1058 patients, part of tumor regions are
annotated in WSIs. Except
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/BCNB.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.broadinstitute.org/bbbc" target="_blank" rel="noopener"
&gt;Broad Bioimage Benchmark Collection (BBBC) - The Broad Bioimage
Benchmark Collection (BBBC)
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Broad-Bioimage-Benchmark-Collection-BBBC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.broadinstitute.org/ccle/home" target="_blank" rel="noopener"
&gt;Broad Cancer Cell Line Encyclopedia
(CCLE)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Broad-Cancer-Cell-Line-Encyclopedia-CCLE.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://cmp.felk.cvut.cz/~borovji3/?page=dataset" target="_blank" rel="noopener"
&gt;CIMA - CIMA dataset includes images of 2D histological microscopy
tissue slices.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/CIMA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.cellimagelibrary.org/home" target="_blank" rel="noopener"
&gt;Cell Image Library - This library is a public and easily accessible
resource database of \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Cell-Image-Library.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://completegenomics.mgiamericas.com/demodata" target="_blank" rel="noopener"
&gt;Complete Genomics Public Data - A diverse data set of whole human
genomes are freely
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Complete-Genomics-Public-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/stanleyhua/cytoimagenet" target="_blank" rel="noopener"
&gt;CytoImageNet - A large-scale dataset of microscopy images. Contains
890,737 total grayscale
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/CytoImageNet.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.ebi.ac.uk/arrayexpress/" target="_blank" rel="noopener"
&gt;EBI ArrayExpress - ArrayExpress Archive of Functional Genomics Data
stores data from high- \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/EBI-ArrayExpress.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.ebi.ac.uk/emdb/" target="_blank" rel="noopener"
&gt;EBI Protein Data Bank in Europe - The Electron Microscopy Data Bank
(EMDB) is a public \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/EBI-Protein-Data-Bank-in-Europe.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.encodeproject.org" target="_blank" rel="noopener"
&gt;ENCODE project - The Encyclopedia of DNA Elements (ENCODE)
Consortium is an ongoing \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/ENCODE-project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.ebi.ac.uk/pdbe/emdb/empiar/" target="_blank" rel="noopener"
&gt;Electron Microscopy Pilot Image Archive (EMPIAR) - EMPIAR, the
Electron Microscopy Public
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Electron-Microscopy-Pilot-Image-Archive-EMPIAR.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://ensemblgenomes.org/" target="_blank" rel="noopener"
&gt;Ensembl Genomes&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Ensembl-Genomes.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.ncbi.nlm.nih.gov/geo/" target="_blank" rel="noopener"
&gt;Gene Expression Omnibus (GEO) - GEO is a public functional genomics
data repository \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Gene-Expression-Omnibus-GEO.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://geneontology.org/docs/download-go-annotations/" target="_blank" rel="noopener"
&gt;Gene Ontology (GO) - GO annotation
files&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Gene-Ontology-GO.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/jhpoelen/eol-globi-data/wiki#accessing-species-interaction-data" target="_blank" rel="noopener"
&gt;Global Biotic Interactions
(GloBI)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Global-Biotic-Interactions-GloBI.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://lincs.hms.harvard.edu" target="_blank" rel="noopener"
&gt;Harvard Medical School (HMS) LINCS Project - The Harvard Medical
School (HMS) LINCS Center is \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Harvard-Medical-School-LINCS-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.hagsc.org/hgdp/files.html" target="_blank" rel="noopener"
&gt;Human Genome Diversity Project - A group of scientists at Stanford
University have \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Human-Genome-Diversity-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.hmpdacc.org/reference_genomes/reference_genomes.php" target="_blank" rel="noopener"
&gt;Human Microbiome Project (HMP) - The HMP sequenced over 2000
reference genomes isolated from
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Human-Microbiome-Project-HMP.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://ico2s.org/datasets/psp_benchmark.html" target="_blank" rel="noopener"
&gt;ICOS PSP Benchmark - The ICOS PSP benchmarks repository contains an
adjustable real-world
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/ICOS-PSP-Benchmark.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://hapmap.ncbi.nlm.nih.gov/downloads/index.html.en" target="_blank" rel="noopener"
&gt;International HapMap
Project&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/International-HapMap-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.ebi.ac.uk/biostudies/JCB/studies" target="_blank" rel="noopener"
&gt;Journal of Cell Biology DataViewer - All JCB data was moved to
Biostudies&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Journal-of-Cell-Biology-DataViewer.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.genome.jp/kegg/" target="_blank" rel="noopener"
&gt;KEGG - KEGG is a database resource for understanding high-level
functions and utilities of \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/KEGG.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.ncbi.nlm.nih.gov/guide/proteins/#databases" target="_blank" rel="noopener"
&gt;NCBI
Proteins&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/NCBI-Proteins.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.ncbi.nlm.nih.gov/taxonomy" target="_blank" rel="noopener"
&gt;NCBI Taxonomy - The NCBI Taxonomy database is a curated set of
names and classifications for
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/NCBI-Taxonomy.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://gdc.cancer.gov/access-data/gdc-data-portal" target="_blank" rel="noopener"
&gt;NCI Genomic Data Commons - The GDC Data Portal is a robust
data-driven platform that allows
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/NCI-Genomic-Data-Commons.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="ftp://ftp.ncbi.nih.gov/pub/geo/DATA/supplementary/series/GSE6532/" &gt;NIH Microarray
data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/NIH-Microarray-data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://opensnp.org/" target="_blank" rel="noopener"
&gt;OpenSNP genotypes data - openSNP allows customers of
direct-to-customer genetic tests to \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/OpenSNP-genotypes-data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://allisonhorst.github.io/palmerpenguins/" target="_blank" rel="noopener"
&gt;Palmer Penguins - The goal of palmerpenguins is to provide a great
dataset for data
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Palmer-Penguins.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.pathguide.org/" target="_blank" rel="noopener"
&gt;Pathguide - Protein-Protein Interactions
Catalog&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Pathguid.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.rcsb.org/" target="_blank" rel="noopener"
&gt;Protein Data Bank - This resource is powered by the Protein Data
Bank archive-information \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Protein-Data-Bank.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.med.unc.edu/pgc/downloads" target="_blank" rel="noopener"
&gt;Psychiatric Genomics Consortium - The purpose of the Psychiatric
Genomics Consortium (PGC) is
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Psychiatric-Genomics-Consortium.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://pubchem.ncbi.nlm.nih.gov/" target="_blank" rel="noopener"
&gt;PubChem Project - PubChem is the world's largest collection of
freely accessible chemical
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/PubChem-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.coremine.com/" target="_blank" rel="noopener"
&gt;PubGene (now Coremine Medical) - COREMINE™ is a family of tools
developed by the Norwegian \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/PubGene-now-Coremine-Medical.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://cancer.sanger.ac.uk/cosmic" target="_blank" rel="noopener"
&gt;Sanger Catalogue of Somatic Mutations in Cancer (COSMIC) - COSMIC,
the Catalogue Of Somatic
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Sanger-Catalogue-of-Somatic-Mutations-in-Cancer-COSMIC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cancerrxgene.org/" target="_blank" rel="noopener"
&gt;Sanger Genomics of Drug Sensitivity in Cancer Project
(GDSC)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Sanger-Genomics-of-Drug-Sensitivity-in-Cancer-Project-GDSC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.ncbi.nlm.nih.gov/Traces/sra/" target="_blank" rel="noopener"
&gt;Sequence Read Archive (SRA) - The Sequence Read Archive (SRA) stores
raw sequence data from
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Sequence-Read-ArchiveSRA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/ababaian/serratus/wiki/Access-Data-Release" target="_blank" rel="noopener"
&gt;Serratus - Analysis of 7.1 million RNA/DNA sequencing datasets to
discover the total
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Serratus-Open-Virome.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://smd.princeton.edu/" target="_blank" rel="noopener"
&gt;Stanford Microarray Data (now retired)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Stanford-Microarray-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.stowers.org/research/publications/odr" target="_blank" rel="noopener"
&gt;Stowers Institute Original Data
Repository&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Stowers-Institute-Original-Data-Repository.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://ssbd.qbic.riken.jp" target="_blank" rel="noopener"
&gt;Systems Science of Biological Dynamics (SSBD) Database - Systems
Science of Biological \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Systems-Science-of-Biological-Dynamics-SSBD-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://gdac.broadinstitute.org/" target="_blank" rel="noopener"
&gt;The Cancer Genome Atlas (TCGA), available via Broad
GDAC&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/The-Cancer-Genome-Atlas-TCGA-available-via-Broad-GDAC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.catalogueoflife.org/data/download" target="_blank" rel="noopener"
&gt;The Catalogue of Life - The Catalogue of Life is a quality-assured
checklist of more than 1.8
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/The-Catalogue-of-Life.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.personalgenomes.org/" target="_blank" rel="noopener"
&gt;The Personal Genome Project - The Personal Genome Project,
initiated in 2005, is a vision and
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/The-Personal-Genome-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://hgdownload.soe.ucsc.edu/downloads.html" target="_blank" rel="noopener"
&gt;UCSC Public Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/UCSC-Public-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://ftp.ncbi.nlm.nih.gov/repository/UniGene/" target="_blank" rel="noopener"
&gt;UniGene&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/UniGene.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.uniprot.org/downloads" target="_blank" rel="noopener"
&gt;Universal Protein Resource (UniProt) - The Universal Protein
Resource (UniProt) is a \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/Universal-Protein-Resource.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://docs.rfam.org/en/latest/database.html" target="_blank" rel="noopener"
&gt;Rfam - The Rfam database is a collection of RNA families, each
represented by multiple
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Biology/rfam.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="chemistry"&gt;Chemistry
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://ilthermo.boulder.nist.gov" target="_blank" rel="noopener"
&gt;Ionic Liquids Database -
ILThermo&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Chemistry/ionicliquids.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="climateweather"&gt;Climate+Weather
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://actuariesclimateindex.org/data/" target="_blank" rel="noopener"
&gt;Actuaries Climate Index&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/Actuaries-Climate-Index.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.bom.gov.au/climate/data/index.shtml??zoom=1&amp;amp;lat=-26.9635&amp;amp;lon=133.4635&amp;amp;layers=B0000TFFFFFFFFFTFFFFFFFFFFFFFFFFTTT&amp;amp;dp=IDC10001&amp;amp;p_nccObsCode=201&amp;amp;p_display_type=dailyDataFile" target="_blank" rel="noopener"
&gt;Australian Weather - Updated webpage for Australian Weather
data.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/Australian-Weather.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://aviationweather.gov/adds/dataserver" target="_blank" rel="noopener"
&gt;Aviation Weather Center - Consistent, timely and accurate weather
information for the world
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/Aviation-Weather-Center.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://sinda.crn.inpe.br/PCD/SITE/novo/site/historico/index.php" target="_blank" rel="noopener"
&gt;Brazilian Weather - Historical data (In Portuguese) - Data related
to climate and weather
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/Brazilian-Weather.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://cds.climate.copernicus.eu/cdsapp#!/home" target="_blank" rel="noopener"
&gt;Several Climate Datasets - The C3S Climate Data Store (CDS) is a
one-stop shop for
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/CDS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://weather.gc.ca/grib/index_e.html" target="_blank" rel="noopener"
&gt;Canadian Meteorological
Centre&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/Canadian-Meteorological-Centre.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://zenodo.org/record/7540792#.Y-QQGK3MKUn" target="_blank" rel="noopener"
&gt;Caravan - a dataset for large-sample hydrology - Caravan is an open
community dataset of
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/Caravan.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.uea.ac.uk/web/groups-and-centres/climatic-research-unit/data" target="_blank" rel="noopener"
&gt;Climate Data from UEA (updated
monthly)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/Climate-Data-from-UEA-updated-monthly.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.knmi.nl/datasets" target="_blank" rel="noopener"
&gt;Dutch Weather - The KNMI Data Center (KDC) portal provides access
to KNMI data on weather, \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/Dutch-Weather.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.ecad.eu/" target="_blank" rel="noopener"
&gt;European Climate Assessment &amp;amp; Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/European-Climate-Assessment-&amp;-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://cdc.dwd.de/portal/" target="_blank" rel="noopener"
&gt;German Climate Data Center&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/German-Meteorological-Service-CDC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://en.tutiempo.net/climate" target="_blank" rel="noopener"
&gt;Global Climate Data Since 1929&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/Global-Climate-Data-Since-1929.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://blog.gdeltproject.org/four-massive-datasets-charting-the-global-climate-change-news-narrative-2009-2020/" target="_blank" rel="noopener"
&gt;Charting The Global Climate Change News Narrative 2009-2020 - These
four datasets represent
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/GlobalClimateChangeNewsNarrative2009-2020.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://wiki.earthdata.nasa.gov/display/GIBS" target="_blank" rel="noopener"
&gt;NASA Global Imagery Browse
Services&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/NASA-Global-Imagery-Browse-Services.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.beringclimate.noaa.gov/" target="_blank" rel="noopener"
&gt;NOAA Bering Sea Climate&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/NOAA-Bering-Sea-Climate.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.ncdc.noaa.gov/data-access/quick-links" target="_blank" rel="noopener"
&gt;NOAA Climate
Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/NOAA-Climate-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.ncdc.noaa.gov/data-access/model-data/model-datasets/numerical-weather-prediction" target="_blank" rel="noopener"
&gt;NOAA Realtime Weather
Models&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/NOAA-Realtime-Weather-Models.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.esrl.noaa.gov/gmd/grad/stardata.html" target="_blank" rel="noopener"
&gt;NOAA SURFRAD Meteorology and Radiation
Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/NOAA-SURFRAD-Meteorology-and-Radiation-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://open-meteo.com" target="_blank" rel="noopener"
&gt;Open-Meteo - Open-Source Weather API - Open-source weather API with
free access for non- \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/Open-Meteo.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.worldbank.org/developers/climate-data-api" target="_blank" rel="noopener"
&gt;The World Bank Open Data Resources for Climate
Change&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/The-World-Bank-Open-Data-Resources-for-Climate-Change.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.cru.uea.ac.uk/data" target="_blank" rel="noopener"
&gt;UEA Climatic Research Unit&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/UEA-Climatic-Research-Unit.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.wunderground.com/history/index.html" target="_blank" rel="noopener"
&gt;WU Historical Weather
Worldwide&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/WU-Historical-Weather-Worldwide.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/washingtonpost/data-2C-beyond-the-limit-usa" target="_blank" rel="noopener"
&gt;Washington Post Climate Change - To analyze warming temperatures in
the United States, The
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/Washington%20Post%20Climate%20Change.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.worldclim.org" target="_blank" rel="noopener"
&gt;WorldClim - Global Climate Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Climate+Weather/WorldClim.yml)\]&lt;/li&gt;
&lt;/ul&gt;
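Several entries in the list above (e.g. Open-Meteo) are HTTP APIs rather than bulk downloads. As a minimal sketch of how such an API might be queried, here is how a forecast request URL could be assembled for the public Open-Meteo endpoint; the coordinates and variable name are illustrative example values, not part of the list itself:

```python
from urllib.parse import urlencode

# Base endpoint of the public Open-Meteo forecast API.
BASE = "https://api.open-meteo.com/v1/forecast"

# Example query: hourly 2 m temperature for an example location (Berlin).
params = {
    "latitude": 52.52,
    "longitude": 13.41,
    "hourly": "temperature_2m",
}

# urlencode joins the parameters into a standard query string.
url = BASE + "?" + urlencode(params)
print(url)
```

The resulting URL can be fetched with any HTTP client; the API returns JSON and requires no API key for non-commercial use.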
&lt;h2 id="complexnetworks"&gt;ComplexNetworks
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://aminer.org/citation" target="_blank" rel="noopener"
&gt;AMiner Citation Network Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/AMiner-Citation-Network-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://archive.org/details/doi-urls" target="_blank" rel="noopener"
&gt;CrossRef DOI URLs&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/CrossRef-DOI-URLs.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://kdl.cs.umass.edu/display/public/DBLP" target="_blank" rel="noopener"
&gt;DBLP Citation
dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/DBLP-Citation-dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.dis.uniroma1.it/challenge9/download.shtml" target="_blank" rel="noopener"
&gt;DIMACS Road Networks
Collection&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/DIMACS-Road-Networks-Collection.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://nber.org/patents/" target="_blank" rel="noopener"
&gt;NBER Patent Citations&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/NBER-Patent-Citations.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://math.nist.gov/~RPozo/complex_datasets.html" target="_blank" rel="noopener"
&gt;NIST complex networks data
collection&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/NIST-complex-networks-data-collection.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://networkrepository.com/" target="_blank" rel="noopener"
&gt;Network Repository with Interactive Exploratory Analysis
Tools&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/Network-Repository-with-Interactive-Exploratory-Analysis-Tools.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://vlado.fmf.uni-lj.si/pub/networks/data/bio/Yeast/Yeast.htm" target="_blank" rel="noopener"
&gt;Protein-protein interaction
network&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/Protein.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://ogirardot.wordpress.com/2013/01/31/sharing-pypimaven-dependency-data/" target="_blank" rel="noopener"
&gt;PyPI and Maven Dependency
Network&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/PyPI-and-Maven-Dependency-Network.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.elsevier.com/solutions/scopus" target="_blank" rel="noopener"
&gt;Scopus Citation
Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/Scopus-Citation-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www-personal.umich.edu/~mejn/netdata/" target="_blank" rel="noopener"
&gt;Small Network Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/Small-Network-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://cs.stanford.edu/~knuth/sgb.html" target="_blank" rel="noopener"
&gt;Stanford GraphBase&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/Stanford-GraphBase-Steven-Skiena.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://snap.stanford.edu/data/" target="_blank" rel="noopener"
&gt;Stanford Large Network Dataset
Collection&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/Stanford-Large-Network-Dataset-Collection.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://stanford.edu/group/sonia/dataSources/index.html" target="_blank" rel="noopener"
&gt;Stanford Longitudinal Network Data
Sources&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/Stanford-Longitudinal-Network-Data-Sources.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://konect.uni-koblenz.de/" target="_blank" rel="noopener"
&gt;The Koblenz Network Collection&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/The-Koblenz-Network-Collection.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://law.di.unimi.it/datasets.php" target="_blank" rel="noopener"
&gt;The Laboratory for Web Algorithmics
(UNIMI)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/The-Laboratory-for-Web-Algorithmics-UNIMI.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://networkdata.ics.uci.edu/resources.php" target="_blank" rel="noopener"
&gt;UCI Network Data
Repository&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/UCI-Network-Data-Repository.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cise.ufl.edu/research/sparse/matrices/" target="_blank" rel="noopener"
&gt;UFL sparse matrix
collection&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/UFL-sparse-matrix-collection.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.eecs.wsu.edu/mgd/gdb.html" target="_blank" rel="noopener"
&gt;WSU Graph Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/WSU-Graph-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.crawdad.org/" target="_blank" rel="noopener"
&gt;Community Resource for Archiving Wireless Data At Dartmouth -
Contains datasets of pcap files \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComplexNetworks/crawdad.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="computernetworks"&gt;ComputerNetworks
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.bigdatanews.com/profiles/blogs/big-data-set-3-5-billion-web-pages-made-available-for-all-of-us" target="_blank" rel="noopener"
&gt;3.5B Web Pages from CommonCrawl
2012&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/3.5B-Web-Pages-from-CommonCrawl-2012.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://cnets.indiana.edu/groups/nan/webtraffic/click-dataset/" target="_blank" rel="noopener"
&gt;53.5B Web clicks of 100K users in Indiana
Univ.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/53.5B-Web-clicks-of-100K-users-in-Indiana-Univ..yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.caida.org/data/overview/" target="_blank" rel="noopener"
&gt;CAIDA Internet Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/CAIDA-Internet-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://crawdad.cs.dartmouth.edu/" target="_blank" rel="noopener"
&gt;CRAWDAD Wireless datasets from Dartmouth
Univ.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/CRAWDAD-Wireless-datasets-from-Dartmouth-Univ..yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://lemurproject.org/clueweb09/" target="_blank" rel="noopener"
&gt;ClueWeb09 - 1B web pages&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/ClueWeb09.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://lemurproject.org/clueweb12/" target="_blank" rel="noopener"
&gt;ClueWeb12 - 733M web pages&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/ClueWeb12.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://commoncrawl.org/the-data/get-started/" target="_blank" rel="noopener"
&gt;CommonCrawl Web Data over 7
years&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/CommonCrawl-Web-Data-over-7-years.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/coveooss/shopper-intent-prediction-nature-2020" target="_blank" rel="noopener"
&gt;Shopper Intent Prediction from Clickstream E‑Commerce Data with
Minimal Browsing
Information&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/Coveo-Shopper-Intent-Prediction.yaml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://labs.criteo.com/2015/03/criteo-releases-its-new-dataset/" target="_blank" rel="noopener"
&gt;Criteo click-through
data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/Criteo-click-through-data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://scans.io/" target="_blank" rel="noopener"
&gt;Internet-Wide Scan Data Repository&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/Internet-Wide-Scan-Data-Repository.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://traffic.comics.unina.it/mirage/" target="_blank" rel="noopener"
&gt;MIRAGE-2019 - MIRAGE-2019 is a human-generated dataset for mobile
traffic analysis with
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/MIRAGE-2019.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.merklemap.com/dns-records-database" target="_blank" rel="noopener"
&gt;Merklemap DNS Records Dataset - Contains 4B+ DNS records across
700 million unique
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/Merklemap-DNS-Records-dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://ooni.torproject.org/data/" target="_blank" rel="noopener"
&gt;OONI: Open Observatory of Network Interference - Internet
censorship data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/OONI-Open-Observatory-of-Network-Interference.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://console.developers.google.com/storage/openmobiledata_public/" target="_blank" rel="noopener"
&gt;Open Mobile Data by
MobiPerf&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/Open-Mobile-Data-by-MobiPerf.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://p2pta.ewi.tudelft.nl/" target="_blank" rel="noopener"
&gt;The Peer-to-Peer Trace Archive - Real-world measurements play a key
role in studying the \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/P2P-Trace-Archive.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://sonar.labs.rapid7.com/" target="_blank" rel="noopener"
&gt;Rapid7 Sonar Internet Scans&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/Rapid7-Sonar-Internet-Scans.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.caida.org/projects/network_telescope/" target="_blank" rel="noopener"
&gt;UCSD Network Telescope, IPv4 /8
net&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ComputerNetworks/UCSD-Network-Telescope-IPv4-slash8-net.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="cybersecurity"&gt;CyberSecurity
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.unb.ca/cic/datasets/andmal2020.html" target="_blank" rel="noopener"
&gt;CCCS-CIC-AndMal-2020 - The dataset includes 200K benign and 200K
malware samples totalling to
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//CyberSecurity/CCCS-CIC-AndMal-2020.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://zenodo.org/record/3746129" target="_blank" rel="noopener"
&gt;Traffic and Log Data Captured During a Cyber Defense Exercise -
This dataset was acquired
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//CyberSecurity/Traffic-and-Log-Data-Captured-During-a-Cyber-Defense-Exercise.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="datachallenges"&gt;DataChallenges
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.aicrowd.com/" target="_blank" rel="noopener"
&gt;AIcrowd Competitions&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/AIcrowd-Competitions.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/duyetdev/bruteforce-database" target="_blank" rel="noopener"
&gt;Bruteforce
Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/Bruteforce-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.chalearn.org/" target="_blank" rel="noopener"
&gt;Challenges in Machine Learning&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/Challenges-in-Machine-Learning.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://data.crowdanalytix.com" target="_blank" rel="noopener"
&gt;CrowdANALYTIX dataX&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/CrowdANALYTIX-dataX.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.d4d.orange.com/en/home" target="_blank" rel="noopener"
&gt;D4D Challenge of Orange&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/D4D-Challenge-of-Orange.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.drivendata.org/" target="_blank" rel="noopener"
&gt;DrivenData Competitions for Social
Good&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/DrivenData-Competitions-for-Social-Good.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.icwsm.org/2018/datasets/datasets/#obtaining" target="_blank" rel="noopener"
&gt;ICWSM Data Challenge (since
2009)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/ICWSM-Data-Challenge-since-2009.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.kddcup2012.org/" target="_blank" rel="noopener"
&gt;KDD Cup by Tencent 2012&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/KDD-Cup-by-Tencent-2012.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/" target="_blank" rel="noopener"
&gt;Kaggle Competition Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/Kaggle-Competition-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/localytics/data-viz-challenge" target="_blank" rel="noopener"
&gt;Localytics Data Visualization
Challenge&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/Localytics-Data-Visualization-Challenge.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/datasets/netflix-inc/netflix-prize-data" target="_blank" rel="noopener"
&gt;Netflix
Prize&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/Netflix-Prize.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://2015.spaceappschallenge.org" target="_blank" rel="noopener"
&gt;Space Apps Challenge&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/Space-Apps-Challenge.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://dandelion.eu/datamine/open-big-data/" target="_blank" rel="noopener"
&gt;Telecom Italia Big Data
Challenge&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/Telecom-Italia-Big-Data-Challenge.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://travistorrent.testroots.org/" target="_blank" rel="noopener"
&gt;TravisTorrent Dataset - MSR'2017 Mining
Challenge&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/TravisTorrent-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://tunedit.org/challenges/" target="_blank" rel="noopener"
&gt;TunedIT - Data mining &amp;amp; machine learning data sets, algorithms,
challenges&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/TunedIT.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.yelp.com/dataset" target="_blank" rel="noopener"
&gt;Yelp Dataset Challenge - The Yelp dataset is a subset of our
businesses, reviews, and user \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//DataChallenges/Yelp-Dataset-Challenge.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="earthscience"&gt;EarthScience
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/SorourMo/38-Cloud-A-Cloud-Segmentation-Dataset" target="_blank" rel="noopener"
&gt;38-Cloud (Cloud Detection) - Contains 38 Landsat 8 scene images and
their manually extracted
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/38-Cloud.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.fao.org/nr/water/aquastat/data/query/index.html?lang=en" target="_blank" rel="noopener"
&gt;AQUASTAT - Global water resources and
uses&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/AQUASTAT.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.bodc.ac.uk/data/" target="_blank" rel="noopener"
&gt;BODC - marine data of ~22K vars&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/BODC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://sedac.ciesin.columbia.edu/data/sets/browse" target="_blank" rel="noopener"
&gt;EOSDIS - NASA's earth observing system
data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/EOSDIS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.earthmodels.org/" target="_blank" rel="noopener"
&gt;Earth Models&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/Earth-Models.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://globalwindatlas.info/" target="_blank" rel="noopener"
&gt;Global Wind Atlas - The Global Wind Atlas is a free, web-based
application developed to help
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/Global-Wind-Atlas.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://imos.aodn.org.au" target="_blank" rel="noopener"
&gt;Integrated Marine Observing System (IMOS) - roughly 30TB of ocean
measurements&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/Integrated-Marine-Observing-System-IMOS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://marinexplore.org/" target="_blank" rel="noopener"
&gt;Marinexplore - Open Oceanographic Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/Marinexplore.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://mymobilebay.com" target="_blank" rel="noopener"
&gt;Alabama Real-Time Coastal Observing System&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/MyMobileBay.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://nerrsdata.org" target="_blank" rel="noopener"
&gt;National Estuarine Research Reserves System-Wide Monitoring
Program - long-term estuarine \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/NERRS-SWMP.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data-ogauthority.opendata.arcgis.com/" target="_blank" rel="noopener"
&gt;Oil and Gas Authority Open Data - The dataset covers 12,500
offshore wellbores, 5,000 seismic
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/Oil-and-Gas-Authority-UK.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://archive.org/details/radiance-geojson-data-2025" target="_blank" rel="noopener"
&gt;Radiance GeoJSON &amp;mdash; Global Light Pollution - Global nighttime
light pollution dataset derived
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/Radiance-GeoJSON-Light-Pollution.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://volcano.si.edu/" target="_blank" rel="noopener"
&gt;Smithsonian Institution Global Volcano and Eruption
Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/Smithsonian-Institution-Global-Volcano-and-Eruption-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://earthquake.usgs.gov/earthquakes/search/" target="_blank" rel="noopener"
&gt;USGS Earthquake
Archives&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/USGS-Earthquake-Archives.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/datasets/robustus/whpa-prediction" target="_blank" rel="noopener"
&gt;Wellhead Protection Area (protection zone) prediction using
breakthrough curves - This
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//EarthScience/WHPA.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="economics"&gt;Economics
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.apo-tokyo.org/wedo/productivity-measurement/" target="_blank" rel="noopener"
&gt;Asian Productivity Organization (APO) - The AEPM provides a graphic
dashboard view of
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/APO.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.aseanstats.org/" target="_blank" rel="noopener"
&gt;ASEAN Stats - The ASEANstatsDataPortal was first launched in
June 2018. The Portal is \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/ASEAN%20Stats.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.aeaweb.org/resources/data" target="_blank" rel="noopener"
&gt;American Economic Association
(AEA)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/American-Economic-Association-AEA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.asiaklems.net/data/archive.asp" target="_blank" rel="noopener"
&gt;Asian KLEMS - Asia KLEMS is an Asian regional research consortium
to promote building
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Asian%20KLEMS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://dataverse.harvard.edu/dataverse/atlas" target="_blank" rel="noopener"
&gt;Harvard Atlas of Economic Complexity - A database for people to
explore global trade flows
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Atlas%20Economic%20Complexity.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.bis.org/statistics/full_data_sets.htm" target="_blank" rel="noopener"
&gt;BIS Financial Database - The files contain the same data as in the
BIS Statistics Explorer
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/BIS%20Financial%20Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.barrolee.com/" target="_blank" rel="noopener"
&gt;Barro-Lee Education Attainment - Barro-Lee Educational Attainment
Data from 1950 to 2010. \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Barro%20Lee.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cepii.fr/CEPII/en/bdd_modele/bdd_modele.asp" target="_blank" rel="noopener"
&gt;CEPII Database - A database of the world economy, through its
country and region profiles, in
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/CEPII%20Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://euklems.eu/query/" target="_blank" rel="noopener"
&gt;EUKLEMS - EU KLEMS is an industry-level growth and productivity
research project. EU KLEMS \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/EUKLEMS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.fraserinstitute.org/economic-freedom/dataset" target="_blank" rel="noopener"
&gt;Economic Freedom of the World
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Economic-Freedom-of-the-World-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.rug.nl/ggdc/historicaldevelopment/na/" target="_blank" rel="noopener"
&gt;Historical National Accounts - The datahub on Comparative
Historical National Accounts
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Historical%20National%20Accounts.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.historicalstatistics.org/" target="_blank" rel="noopener"
&gt;Historical MacroEconomic
Statistics&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Historical-MacroEconomic-Statistics.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://inforumecon.com/" target="_blank" rel="noopener"
&gt;INFORUM - Interindustry Forecasting at the University of
Maryland&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/INFORUM.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://db.nomics.world/" target="_blank" rel="noopener"
&gt;DBnomics &amp;ndash; the world's economic database - Aggregates hundreds of
millions of time series \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/International-Economics-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.worldbank.org/topic/21" target="_blank" rel="noopener"
&gt;International Trade Statistics - The new link contains trade based
on filtered search on the
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/International-Trade-Statistics.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.upcdatabase.com/" target="_blank" rel="noopener"
&gt;Internet Product Code Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Internet-Product-Code-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.jedh.org/" target="_blank" rel="noopener"
&gt;Joint External Debt Data Hub&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Joint-External-Debt-Data-Hub.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://freit.org/TradeResources/TradeData.php" target="_blank" rel="noopener"
&gt;Jon Haveman International Trade Data
Links&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Jon-Haveman-International-Trade-Data-Links.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://laklems.net/stats/result" target="_blank" rel="noopener"
&gt;Latin America KLEMS - LAKLEMS is a technical cooperation project
financed by the Inter- \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/LA%20KLEMS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://longtermproductivity.com/download.html" target="_blank" rel="noopener"
&gt;Long-Term Productivity Database - The Long-Term Productivity
database was created as a
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Long-Term-Productivity-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.rug.nl/ggdc/historicaldevelopment/maddison/releases/" target="_blank" rel="noopener"
&gt;Maddison Project Database - The Maddison Project Database provides
information on comparative
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Maddison%20Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://ntaccounts.org/web/nta/show/Browse%20database#H-zfl0oo" target="_blank" rel="noopener"
&gt;National Transfer Accounts - The goal of the National Transfer
Accounts (NTA) project is to
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/NTA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://opencorporates.com/" target="_blank" rel="noopener"
&gt;OpenCorporates Database of Companies in the
World&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/OpenCorporates-Database-of-Companies-in-the-World.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://ourworldindata.org/" target="_blank" rel="noopener"
&gt;Our World in Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Our-World-in-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.rug.nl/ggdc/productivity/pwt/?lang=en/" target="_blank" rel="noopener"
&gt;Penn World Table - PWT version 10.0 is a database with information
on relative levels of
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/Penn%20World%20Table.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://econ.sciences-po.fr/thierry-mayer/data" target="_blank" rel="noopener"
&gt;SciencesPo World Trade Gravity
Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/SciencesPo-World-Trade-Gravity-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://atlas.cid.harvard.edu" target="_blank" rel="noopener"
&gt;The Atlas of Economic Complexity&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/The-Atlas-of-Economic-Complexity.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://cid.econ.ucdavis.edu" target="_blank" rel="noopener"
&gt;The Center for International Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/The-Center-for-International-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://atlas.media.mit.edu/en/" target="_blank" rel="noopener"
&gt;The Observatory of Economic
Complexity&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/The-Observatory-of-Economic-Complexity.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://comtrade.un.org/data/" target="_blank" rel="noopener"
&gt;UN Commodity Trade Statistics&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/UN-Commodity-Trade-Statistics.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://hdr.undp.org/en" target="_blank" rel="noopener"
&gt;UN Human Development Reports&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/UN-Human-Development-Reports.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.rug.nl/ggdc/valuechain/wiod/" target="_blank" rel="noopener"
&gt;World Input-Output Database - World Input-Output Tables and
underlying data, covering 43
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/World%20Input-Output%20Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.worldklems.net/wkanalytical" target="_blank" rel="noopener"
&gt;World KLEMS - Analytical KLEMS-type data sets for a broad set of
countries around the world.
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Economics/World%20KLEMS.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="education"&gt;Education
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://collegescorecard.ed.gov/data/" target="_blank" rel="noopener"
&gt;College Scorecard Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Education/College-Scorecard-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.nysed.gov/downloads.php" target="_blank" rel="noopener"
&gt;New York State Education Department Data - The New York State
Education Department (NYSED) is
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Education/New-York-State-Education-Department.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.oecd.org/pisa/" target="_blank" rel="noopener"
&gt;Program for International Student Assessment (PISA) - Contains
15-year-old students' \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Education/PISA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/freeCodeCamp/open-data" target="_blank" rel="noopener"
&gt;Student Data from Free Code
Camp&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Education/Student-Data-from-Free-Code-Camp.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="energy"&gt;Energy
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://ampds.org/" target="_blank" rel="noopener"
&gt;AMPds - The Almanac of Minutely Power dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/AMPds.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://energy.duke.edu/content/building-level-fully-labeled-electricity-disaggregation-blued" target="_blank" rel="noopener"
&gt;BLUEd - Building-Level fUlly labeled Electricity Disaggregation
dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/BLUEd.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://combed.github.io/" target="_blank" rel="noopener"
&gt;COMBED&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/COMBED.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/ECSIM/dbfc-dataset" target="_blank" rel="noopener"
&gt;DBFC - Direct Borohydride Fuel Cell (DBFC)
Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/DBFC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.datafirst.uct.ac.za/dataportal/index.php/catalog/DELS" target="_blank" rel="noopener"
&gt;DEL - Domestic Electrical Load study datasets for South Africa
(1994 -
2014)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/DEL.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.vs.inf.ethz.ch/res/show.html?what=eco-data" target="_blank" rel="noopener"
&gt;ECO - The ECO data set is a comprehensive data set for
non-intrusive load monitoring and
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/ECO.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.eia.gov/electricity/data/eia923/" target="_blank" rel="noopener"
&gt;EIA&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/EIA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://datasets.wri.org/dataset/globalpowerplantdatabase" target="_blank" rel="noopener"
&gt;Global Power Plant Database - The Global Power Plant Database is a
comprehensive, open source
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/Global%20Power%20Plant%20Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://randd.defra.gov.uk/Default.aspx?Menu=Menu&amp;amp;Module=More&amp;amp;Location=None&amp;amp;ProjectID=17359&amp;amp;FromSearch=Y&amp;amp;Publisher=1&amp;amp;SearchText=EV0702&amp;amp;SortString=ProjectCode&amp;amp;SortOrder=Asc&amp;amp;Paging=10#Description" target="_blank" rel="noopener"
&gt;HES - Household Electricity Study,
UK&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/HES.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://hfed.github.io/" target="_blank" rel="noopener"
&gt;HFED&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/HFED.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/MOREDataset/MORED" target="_blank" rel="noopener"
&gt;MORED: a Moroccan Buildings&amp;rsquo; Electricity Consumption Dataset -
Since spring of 2019, a data
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/MORED.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.marktstammdatenregister.de/MaStR/Datendownload" target="_blank" rel="noopener"
&gt;Marktstammdatenregister - The German Marktstammdatenregister
(MaStR) is a database of all
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/MaStR.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/ECSIM/pem-dataset1" target="_blank" rel="noopener"
&gt;PEM1 - Proton Exchange Membrane (PEM) Fuel Cell
Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/PEM1.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://plaidplug.com/" target="_blank" rel="noopener"
&gt;PLAID - The Plug Load Appliance Identification
Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/PLAID.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/catalyst-cooperative/pudl" target="_blank" rel="noopener"
&gt;The Public Utility Data Liberation Project (PUDL) - PUDL makes US
energy data easier to
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/PUDL.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://redd.csail.mit.edu/" target="_blank" rel="noopener"
&gt;REDD&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/REDD.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.nature.com/articles/s41597-020-0434-6" target="_blank" rel="noopener"
&gt;SYND - A synthetic energy dataset for non-intrusive load
monitoring - With SynD, we present a
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/SYND.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://smda.github.io/smart-meter-data-portal" target="_blank" rel="noopener"
&gt;Smart Meter Data Portal - The Smart Meter Data Portal is part of
the National Science
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/Smart%20Meter%20Data%20Portal.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/areinhardt/tracebase" target="_blank" rel="noopener"
&gt;Tracebase&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/Tracebase.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://ukrstat.gov.ua/operativ/menu/menu_e/energ.htm" target="_blank" rel="noopener"
&gt;Ukraine Energy Centre
Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/UDEC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://jack-kelly.com/data" target="_blank" rel="noopener"
&gt;UK-DALE - UK Domestic Appliance-Level
Electricity&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/UK-DALE.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://nilmworkshop.org/2016/proceedings/Poster_ID18.pdf" target="_blank" rel="noopener"
&gt;WHITED&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/WHITED.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://iawe.github.io/" target="_blank" rel="noopener"
&gt;iAWE&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Energy/iAWE.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="entertainment"&gt;Entertainment
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/aayushmishra1512/twitchdata" target="_blank" rel="noopener"
&gt;Top Streamers on Twitch - Data on the top 1,000 Twitch streamers
from the past year.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Entertainment/TwitchStreamersData.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="finance"&gt;Finance
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.bis.org/statistics/full_data_sets.htm" target="_blank" rel="noopener"
&gt;BIS Statistics - BIS statistics, compiled in cooperation with
central banks and other
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/BIS%20Statistics.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/Blockmodo/coin_registry" target="_blank" rel="noopener"
&gt;Blockmodo Coin Registry - A registry of JSON formatted information
files that is primarily
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/Blockmodo-Coin-Registry)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://cfe.cboe.com/market-data/" target="_blank" rel="noopener"
&gt;CBOE Futures Exchange&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/CBOE-Futures-Exchange.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/aayushmishra1512/faang-complete-stock-data" target="_blank" rel="noopener"
&gt;Complete FAANG Stock data - This data set contains all the stock
data of FAANG companies from
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/FAANG-StockData.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.google.com/finance" target="_blank" rel="noopener"
&gt;Google Finance&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/Google-Finance.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.google.com/trends?q=google&amp;amp;ctab=0&amp;amp;geo=all&amp;amp;date=all&amp;amp;sort=0" target="_blank" rel="noopener"
&gt;Google
Trends&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/Google-Trends.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.nasdaq.com/" target="_blank" rel="noopener"
&gt;NASDAQ&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/NASDAQ.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="ftp://ftp.nyxdata.com/" &gt;NYSE Market Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/NYSE-Market-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.oanda.com/" target="_blank" rel="noopener"
&gt;OANDA&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/OANDA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://fisher.osu.edu/fin/fdf/osudata.htm" target="_blank" rel="noopener"
&gt;OSU Financial data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/OSU-Financial-data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.quandl.com/" target="_blank" rel="noopener"
&gt;Quandl&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/Quandl.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.sec.gov/edgar/about" target="_blank" rel="noopener"
&gt;SEC EDGAR - EDGAR, the Electronic Data Gathering, Analysis, and
Retrieval system, is the \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/SEC-EDGAR.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://research.stlouisfed.org/fred2/" target="_blank" rel="noopener"
&gt;St Louis Federal&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/St-Louis-Federal.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://finance.yahoo.com/" target="_blank" rel="noopener"
&gt;Yahoo Finance&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Finance/Yahoo-Finance.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="gis"&gt;GIS
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/OloOcki/awesome-citygml" target="_blank" rel="noopener"
&gt;Awesome 3D Semantic City Models - Collection of open 3D semantic
city and region models.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/3D-Semantic-City-Models.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://opendata.arcgis.com/" target="_blank" rel="noopener"
&gt;ArcGIS Open Data portal&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/ArcGIS-Open-Data-portal.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://cambridgegis.github.io/gisdata.html" target="_blank" rel="noopener"
&gt;Cambridge, MA, US, GIS data on
GitHub&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Cambridge-MA-US-GIS-data-on-GitHub.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.back4app.com/database/back4app/list-of-all-continents-countries-cities" target="_blank" rel="noopener"
&gt;Database of all continents, countries,
States/Subdivisions/Provinces and Cities - Database
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Database-of-Continents-Coutries-States-Cities.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://places.factual.com/data/t/places" target="_blank" rel="noopener"
&gt;Factual Global Location
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Factual-Global-Location-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://dase.grss-ieee.org" target="_blank" rel="noopener"
&gt;IEEE Geoscience and Remote Sensing Society DASE
Website&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/GRSS-DASE-Website.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/simonepri/geo-maps" target="_blank" rel="noopener"
&gt;Geo Maps - High Quality GeoJSON maps programmatically
generated&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Geo-Maps.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://geodacenter.asu.edu/datalist/" target="_blank" rel="noopener"
&gt;Geo Spatial Data from ASU&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Geo-Spatial-Data-from-ASU.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://geo-wiki.org/" target="_blank" rel="noopener"
&gt;Geo Wiki Project - Citizen-driven Environmental
Monitoring&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Geo-Wiki-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://download.geofabrik.de/" target="_blank" rel="noopener"
&gt;GeoFabrik - OSM data extracted to a variety of formats and
areas&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/GeoFabrik.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.geonames.org/" target="_blank" rel="noopener"
&gt;GeoNames Worldwide&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/GeoNames-Worldwide.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://gadm.org/" target="_blank" rel="noopener"
&gt;Global Administrative Areas Database (GADM) - Geospatial data
organized by country. Includes \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Global-Administrative-Areas-Database-GADM.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://hifld-geoplatform.opendata.arcgis.com/" target="_blank" rel="noopener"
&gt;Homeland Infrastructure Foundation-Level
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Homeland-Infrastructure-Foundation.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://aws.amazon.com/public-data-sets/landsat/" target="_blank" rel="noopener"
&gt;Landsat 8 on AWS&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Landsat-8-on-AWS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/umpirsky/country-list" target="_blank" rel="noopener"
&gt;List of all countries in all
languages&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/List-of-all-countries-in-all-languages.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.nws.noaa.gov/gis/" target="_blank" rel="noopener"
&gt;National Weather Service GIS Data
Portal&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/National-Weather-Service-GIS-Data-Portal.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.naturalearthdata.com/downloads/" target="_blank" rel="noopener"
&gt;Natural Earth - vectors and rasters of the
world&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Natural-Earth.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://openaddresses.io/" target="_blank" rel="noopener"
&gt;OpenAddresses&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/OpenAddresses.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://wiki.openstreetmap.org/wiki/Downloading_data" target="_blank" rel="noopener"
&gt;OpenStreetMap
(OSM)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/OpenStreetMap-OSM.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://pleiades.stoa.org/" target="_blank" rel="noopener"
&gt;Pleiades - Gazetteer and graph of ancient
places&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Pleiades.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/kno10/reversegeocode" target="_blank" rel="noopener"
&gt;Reverse Geocoder using OSM
data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Reverse-Geocoder-using-OSM-data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://freegisdata.rtwilson.com" target="_blank" rel="noopener"
&gt;Robin Wilson - Free GIS Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Robin-Wilson-Free-GIS-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/VIDA-NYU/shadow-accrual-maps/" target="_blank" rel="noopener"
&gt;Shadow Accrual Maps - The repository contains the accumulated
shadow information for New York
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/Shadow-Accrual-Maps.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.census.gov/geo/maps-data/data/tiger-line.html" target="_blank" rel="noopener"
&gt;TIGER/Line - U.S. boundaries and
roads&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/TIGER-Line.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://efele.net/maps/tz/world/" target="_blank" rel="noopener"
&gt;TZ Timezones shapefile&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/TZ-Timezones-shapfiles.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/foursquare/twofishes" target="_blank" rel="noopener"
&gt;TwoFishes - Foursquare's coarse
geocoder&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/TwoFishes.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://geodata.grid.unep.ch/" target="_blank" rel="noopener"
&gt;UN Environmental Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/UN-Environmental-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://geonode.state.gov/layers/?limit=100&amp;amp;offset=0" target="_blank" rel="noopener"
&gt;World boundaries from the U.S. Department of
State&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/World-boundaries-from--the-U.S.-Department-of-State.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/mledoze/countries" target="_blank" rel="noopener"
&gt;World countries in multiple
formats&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/World-countries-in-multiple-formats.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://map-verse.github.io/Repository/" target="_blank" rel="noopener"
&gt;MAP-VERSE - MAP usability - Validated Empirical Research by
Systematic Evaluation - A curated
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//GIS/map-verse.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="government"&gt;Government
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://open.alberta.ca" target="_blank" rel="noopener"
&gt;Alberta, Province of Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Alberta-Province-of-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://opendata.antwerpen.be/datasets" target="_blank" rel="noopener"
&gt;Antwerp, Belgium&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Antwerp-Belgium.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://datos.gob.ar" target="_blank" rel="noopener"
&gt;Argentina (non official)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Argentina-non-official.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://datos.gob.ar/" target="_blank" rel="noopener"
&gt;Datos Argentina - Open data portal of the Argentine Republic. Find
public data \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Argentina.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.austintexas.gov/" target="_blank" rel="noopener"
&gt;Austin, TX, US&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Austin-TX-US.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.abs.gov.au/AUSSTATS/abs@.nsf/DetailsPage/3301.02009?OpenDocument" target="_blank" rel="noopener"
&gt;Australia
(abs.gov.au)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Australia-abs.gov.au.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.gov.au/" target="_blank" rel="noopener"
&gt;Australia (data.gov.au)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Australia-data.gov.au.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.data.gv.at/" target="_blank" rel="noopener"
&gt;Austria (data.gv.at)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Austria-data.gv.at.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.brla.gov/" target="_blank" rel="noopener"
&gt;Baton Rouge, LA, US&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Baton-Rouge-LA-US.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.beer-sheva.muni.il/OpenData/Pages/default.aspx" target="_blank" rel="noopener"
&gt;Beersheba, Israel - Open Data Portal (Smart7
OpenData)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Beersheba-Israel.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.gov.be/" target="_blank" rel="noopener"
&gt;Belgium&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Belgium.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.cityofberkeley.info/" target="_blank" rel="noopener"
&gt;City of Berkeley Open Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Berkeley-CA-Open-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://dados.gov.br/dados/conjuntos-dados" target="_blank" rel="noopener"
&gt;Brazil&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Brazil.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.buenosaires.gob.ar/" target="_blank" rel="noopener"
&gt;Buenos Aires, Argentina&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Buenos-Aires-Argentina.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.calgary.ca/" target="_blank" rel="noopener"
&gt;Calgary, AB, Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Calgary-AB-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.cambridgema.gov/" target="_blank" rel="noopener"
&gt;Cambridge, MA, US&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Cambridge-MA-US.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://open.canada.ca/" target="_blank" rel="noopener"
&gt;Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.cityofchicago.org/" target="_blank" rel="noopener"
&gt;Chicago&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Chicago.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://datos.gob.cl/dataset" target="_blank" rel="noopener"
&gt;Chile&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Chile.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.stats.gov.cn/english/" target="_blank" rel="noopener"
&gt;China&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/China)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.dallasopendata.com/" target="_blank" rel="noopener"
&gt;Dallas Open Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Dallas-Open-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.data.gov.bc.ca/" target="_blank" rel="noopener"
&gt;DataBC - data from the Province of British
Columbia&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/DataBC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://fiscaldata.treasury.gov/datasets/debt-to-the-penny/debt-to-the-penny" target="_blank" rel="noopener"
&gt;Debt to the Penny - The Debt to the Penny dataset provides
information about the total
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Debt-to-penny.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.denvergov.org//" target="_blank" rel="noopener"
&gt;Denver Open Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Denver-Open-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://live-durhamnc.opendata.arcgis.com/" target="_blank" rel="noopener"
&gt;Durham, NC Open Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Durham-NC-Open-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.edmonton.ca/" target="_blank" rel="noopener"
&gt;Edmonton, AB, Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Edmonton-AB-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://lginform.local.gov.uk/" target="_blank" rel="noopener"
&gt;England LGInform&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/England-LGInform.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://ec.europa.eu/eurostat/data/database" target="_blank" rel="noopener"
&gt;EuroStat&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/EuroStat.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://everypolitician.org/" target="_blank" rel="noopener"
&gt;EveryPolitician - Ongoing project collating and sharing data on
every politician.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/EveryPolitician.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://nces.ed.gov/FCSM/index.asp" target="_blank" rel="noopener"
&gt;Federal Committee on Statistical Methodology (FCSM) (formerly
FedStats)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/FedStats.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.opendata.fi/en" target="_blank" rel="noopener"
&gt;Finland&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Finland.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.data.gouv.fr/en/datasets/" target="_blank" rel="noopener"
&gt;France&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/France.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.fredericton.ca/en/citygovernment/Catalogue.asp" target="_blank" rel="noopener"
&gt;Fredericton, NB,
Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Fredericton-NB-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.gatineau.ca/donneesouvertes/default_fr.aspx" target="_blank" rel="noopener"
&gt;Gatineau, QC,
Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Gatineau-QC-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www-genesis.destatis.de/genesis/online" target="_blank" rel="noopener"
&gt;Germany&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Germany.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.stad.gent/explore" target="_blank" rel="noopener"
&gt;Ghent, Belgium&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Ghent-Belgium.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.glasgow.gov.uk/" target="_blank" rel="noopener"
&gt;Glasgow, Scotland, UK&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Glasgow-Scotland-UK.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.data.gov.gr/" target="_blank" rel="noopener"
&gt;Greece&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Greece.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.guardian.co.uk/world-government-data" target="_blank" rel="noopener"
&gt;Guardian world
governments&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Guardian-world-governments.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.halifax.ca/home/open-data" target="_blank" rel="noopener"
&gt;Halifax, NS, Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Halifax-NS-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.hri.fi/en/" target="_blank" rel="noopener"
&gt;Helsinki Region, Finland&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Helsinki-Region-Finland.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.gov.hk/en/" target="_blank" rel="noopener"
&gt;Hong Kong, China&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Hong-Kong-China.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.houstontx.gov/" target="_blank" rel="noopener"
&gt;Houston, TX, US&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Houston-TX-US.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.gov.in/" target="_blank" rel="noopener"
&gt;Indian Government Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Indian-Government-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.go.id/" target="_blank" rel="noopener"
&gt;Indonesian Data Portal&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Indonesian-Data-Portal.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.iowa.gov/" target="_blank" rel="noopener"
&gt;Iowa - Welcome to the State of Iowa's data portal. Please explore
data about Iowa and your \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Iowa.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.gov.ie/data" target="_blank" rel="noopener"
&gt;Ireland's Open Data Portal&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Irelands-Open-Data-Portal.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.gov.il" target="_blank" rel="noopener"
&gt;Israel's Open Data Portal&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Israel.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.ibb.gov.tr" target="_blank" rel="noopener"
&gt;Istanbul Municipality Open Data Portal&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Istanbul-Municipality-Open-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.dati.gov.it/" target="_blank" rel="noopener"
&gt;Italy - Il Portale dati.gov.it è il catalogo nazionale dei metadati
relativi ai dati \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Italy.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.reuters.com/investigates/special-report/usa-jails-graphic/" target="_blank" rel="noopener"
&gt;Jail deaths in America - The U.S. government does not release jail
by jail mortality data,
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Jail-deaths-in-America.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.e-stat.go.jp/SG1/estat/eStatTopPortalE.do" target="_blank" rel="noopener"
&gt;Japan&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Japan.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.laval.ca/Pages/Fr/Citoyens/donnees.aspx" target="_blank" rel="noopener"
&gt;Laval, QC,
Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Laval-QC-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.lexingtonky.gov/" target="_blank" rel="noopener"
&gt;Lexington, KY&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Lexington-KY.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://data.london.gov.uk/dataset" target="_blank" rel="noopener"
&gt;London Datastore, UK&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/London-Datastore-UK.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.london.ca/city-hall/open-data/Pages/default.aspx" target="_blank" rel="noopener"
&gt;London, ON,
Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/London-ON-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.lacity.org/" target="_blank" rel="noopener"
&gt;Los Angeles Open Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Los-Angeles-Open-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.public.lu/en/" target="_blank" rel="noopener"
&gt;Luxembourg - Luxembourgish Open Data
Portal&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Luxembourg.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.gov.my/" target="_blank" rel="noopener"
&gt;Malaysia&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Malaysia.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.mass.gov/anf/research-and-tech/it-serv-and-support/application-serv/office-of-geographic-information-massgis/" target="_blank" rel="noopener"
&gt;MassGIS, Massachusetts,
U.S.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/MassGIS-Massachusetts-U.S..yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://mtc.ca.gov/tools-resources/data-tools/open-data-library" target="_blank" rel="noopener"
&gt;Metropolitan Transportation Commission (MTC), California,
US&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Metropolitain-Transportation-Commission-MTC-California-US.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://datos.gob.mx/busca/dataset" target="_blank" rel="noopener"
&gt;Mexico&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Mexico.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.mississauga.ca/portal/residents/publicationsopendatacatalogue" target="_blank" rel="noopener"
&gt;Mississauga, ON,
Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Missisauga-ON-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.gov.md/" target="_blank" rel="noopener"
&gt;Moldova&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Moldova.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://open.moncton.ca/" target="_blank" rel="noopener"
&gt;Moncton, NB, Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Moncton-NB-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://donnees.ville.montreal.qc.ca/" target="_blank" rel="noopener"
&gt;Montreal, QC, Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Montreal-QC-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data-mountainview.opendata.arcgis.com/" target="_blank" rel="noopener"
&gt;Mountain View, California, US
(GIS)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Mountain-View-California-US-GIS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://opendata.cityofnewyork.us/" target="_blank" rel="noopener"
&gt;NYC Open Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/NYC-Open-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://betanyc.us/" target="_blank" rel="noopener"
&gt;NYC betanyc&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/NYC-betanyc.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.overheid.nl/" target="_blank" rel="noopener"
&gt;Netherlands&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Netherlands.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.cityofnewyork.us/City-Government/DSNY-Monthly-Tonnage-Data/ebb7-mvp5" target="_blank" rel="noopener"
&gt;New York Department of Sanitation Monthly Tonnage - DSNY Monthly
Tonnage Data provides
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/New-York-Department-of-Sanitation.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.stats.govt.nz/browse_for_stats.aspx" target="_blank" rel="noopener"
&gt;New Zealand&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/New-Zealand.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://data.oecd.org/" target="_blank" rel="noopener"
&gt;OECD&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/OECD.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://data.oaklandnet.com/" target="_blank" rel="noopener"
&gt;Oakland, California, US&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Oakland-California-US.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.ok.gov/" target="_blank" rel="noopener"
&gt;Oklahoma&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Oklahoma.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://opendataforafrica.org/" target="_blank" rel="noopener"
&gt;Open Data for Africa&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Open-Data-for-Africa.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.gov.in/" target="_blank" rel="noopener"
&gt;Open Government Data (OGD) Platform India&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Open-Government-Data-OGD-Platform-India.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.opendatasoft.com/blog/2015/11/02/how-we-put-together-a-list-of-1600-open-data-portals-around-the-world-to-help-open-data-community" target="_blank" rel="noopener"
&gt;OpenDataSoft's list of 1,600 open
data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/OpenDataSofts-list-of-1600-open-data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.oregon.gov/" target="_blank" rel="noopener"
&gt;Oregon&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Oregon.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.ottawa.ca/en/" target="_blank" rel="noopener"
&gt;Ottawa, ON, Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Ottawa-ON-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://data.cityofpaloalto.org/home" target="_blank" rel="noopener"
&gt;Palo Alto, California, US&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Palo-Alto-California-US.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.opendataphilly.org/" target="_blank" rel="noopener"
&gt;OpenDataPhilly - OpenDataPhilly is a catalog of open data in the
Philadelphia region. In \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Philadelphia-Open-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.portlandoregon.gov/28130" target="_blank" rel="noopener"
&gt;Portland, Oregon&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Portland-Oregon.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.pordata.pt/en/Home" target="_blank" rel="noopener"
&gt;Portugal - Pordata organization&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Portugal.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://data.pr.gov//" target="_blank" rel="noopener"
&gt;Puerto Rico Government&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Puerto-Rico-Government.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://donnees.ville.quebec.qc.ca/" target="_blank" rel="noopener"
&gt;Quebec City, QC, Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Quebec-City-QC-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.donneesquebec.ca/en/" target="_blank" rel="noopener"
&gt;Quebec Province of Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Quebec-Province-of-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://open.regina.ca/" target="_blank" rel="noopener"
&gt;Regina SK, Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Regina-SK-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.data.rio/" target="_blank" rel="noopener"
&gt;Rio de Janeiro, Brazil&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Rio-de-Janeiro-Brazil.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.gov.ro/" target="_blank" rel="noopener"
&gt;Romania&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Romania.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://data.gov.ru" target="_blank" rel="noopener"
&gt;Russia&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Russia.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.sandiego.gov" target="_blank" rel="noopener"
&gt;San Diego, CA&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/San%20Diego,%20CA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://cinow.info/" target="_blank" rel="noopener"
&gt;San Antonio, TX - Community Information Now - CI:Now is a nonprofit
serving Bexar (San \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/San-Antonio-TX-US-Community-Information-Now.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://datasf.org/" target="_blank" rel="noopener"
&gt;San Francisco Data sets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/San-Francisco-Data-sets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.sanjoseca.gov/" target="_blank" rel="noopener"
&gt;San Jose, California, US&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/San-Jose-California-US.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.smcgov.org/" target="_blank" rel="noopener"
&gt;San Mateo County, California, US&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/San-Mateo-County-California-US.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://opendatask.ca/data/" target="_blank" rel="noopener"
&gt;Saskatchewan, Province of Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Saskatchewan-Province-of-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.seattle.gov/" target="_blank" rel="noopener"
&gt;Seattle&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Seattle.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.gov.sg/" target="_blank" rel="noopener"
&gt;Singapore Government Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Singapore-Government-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.econostatistics.co.za/" target="_blank" rel="noopener"
&gt;South Africa Trade Statistics&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/South-Africa-Trade-Statistics.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.statssa.gov.za/" target="_blank" rel="noopener"
&gt;South Africa&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/South-Africa.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://opendata.utah.gov/" target="_blank" rel="noopener"
&gt;State of Utah, US&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/State-of-Utah-US.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.opendata.admin.ch/" target="_blank" rel="noopener"
&gt;Switzerland&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Switzerland.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.gov.tw/" target="_blank" rel="noopener"
&gt;Taiwan gov&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Taiwan-g0v.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.gov.tw/" target="_blank" rel="noopener"
&gt;Taiwan&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Taiwan.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://opendata.tel-aviv.gov.il/en/Pages/home.aspx" target="_blank" rel="noopener"
&gt;Tel-Aviv Open
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Tel-Aviv.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.texas.gov/" target="_blank" rel="noopener"
&gt;Texas Open Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Texas-Open-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://openknowledge.worldbank.org/handle/10986/2124" target="_blank" rel="noopener"
&gt;The World
Bank&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/The-World-Bank.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://open.toronto.ca/" target="_blank" rel="noopener"
&gt;Toronto, ON, Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Toronto-ON-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.data.gov.tn/" target="_blank" rel="noopener"
&gt;Tunisia&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Tunisia.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.gov.uk" target="_blank" rel="noopener"
&gt;U.K. Government Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/U.K.-Government-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.census.gov/programs-surveys/acs/" target="_blank" rel="noopener"
&gt;U.S. American Community
Survey&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/U.S.-American-Community-Survey.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.cdc.gov/nchs/data_access/ftp_data.htm" target="_blank" rel="noopener"
&gt;U.S. CDC Public Health
datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/U.S.-CDC-Public-Health-datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.census.gov/data.html" target="_blank" rel="noopener"
&gt;U.S. Census Bureau&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/U.S.-Census-Bureau.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.huduser.gov/portal/datasets/pdrdatas.html" target="_blank" rel="noopener"
&gt;U.S. Department of Housing and Urban Development
(HUD)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/U.S.-Department-of-Housing-and-Urban-Development-HUD.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.data.gov/metrics" target="_blank" rel="noopener"
&gt;U.S. Federal Government Agencies&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/U.S.-Federal-Government-Agencies.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://catalog.data.gov/dataset" target="_blank" rel="noopener"
&gt;U.S. Federal Government Data
Catalog&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/U.S.-Federal-Government-Data-Catalog.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://open.fda.gov/index.html" target="_blank" rel="noopener"
&gt;U.S. Food and Drug Administration
(FDA)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/U.S.-Food-and-Drug-Administration-FDA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://nces.ed.gov/" target="_blank" rel="noopener"
&gt;U.S. National Center for Education Statistics
(NCES)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/U.S.-National-Center-for-Education-Statistics-NCES.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.data.gov/open-gov/" target="_blank" rel="noopener"
&gt;U.S. Open Government&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/U.S.-Open-Government.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.cdrc.ac.uk/product/cdrc-2011-census-open-atlas" target="_blank" rel="noopener"
&gt;UK 2011 Census Open Atlas
Project&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/UK-2011-Census-Open-Atlas-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.unesco.org" target="_blank" rel="noopener"
&gt;UNESCO Data Hub - UNESCO's official data catalog providing
authoritative global statistics \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/UNESCO-Data-Hub.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/evangambit/JsonOfCounties" target="_blank" rel="noopener"
&gt;US Counties - This is a repository of various data, broken down by
US county. While most of
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/US-Counties.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.uspto.gov/learning-and-resources/bulk-data-products" target="_blank" rel="noopener"
&gt;U.S. Patent and Trademark Office (USPTO) Bulk Data
Products&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/USPTO-Bulk-Data-Products.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.ubos.org/unda/index.php/catalog" target="_blank" rel="noopener"
&gt;Uganda Bureau of
Statistics&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Uganda-Bureau-of-Statistics.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.gov.ua/" target="_blank" rel="noopener"
&gt;Ukraine&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Ukraine.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.un.org/" target="_blank" rel="noopener"
&gt;United Nations&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/United-Nations.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://catalogodatos.gub.uy/" target="_blank" rel="noopener"
&gt;Uruguay&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Uruguay.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.vta.org/" target="_blank" rel="noopener"
&gt;Valley Transportation Authority (VTA), California,
US&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Valley-Transportation-Authority-VTA-California-US.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.vancouver.ca/datacatalogue/" target="_blank" rel="noopener"
&gt;Vancouver, BC Open Data
Catalog&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Vancouver-BC-Open-Data-Catalog.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://opendata.victoria.ca/" target="_blank" rel="noopener"
&gt;Victoria, BC, Canada&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Victoria-BC-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://open.wien.gv.at/site/open-data/" target="_blank" rel="noopener"
&gt;Vienna, Austria&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Vienna-Austria.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.gso.gov.vn/Default_en.aspx?tabid=491" target="_blank" rel="noopener"
&gt;Statistics from the General Statistics Office of Vietnam - Data in
different categories are
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/Vietnam.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.everycrsreport.com/" target="_blank" rel="noopener"
&gt;U.S. Congressional Research Service (CRS)
Reports&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Government/everycrsreport.yml)\]&lt;/li&gt;
&lt;/ul&gt;
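Many of the government portals listed above (data.gov, data.gov.uk, and others) are built on CKAN, which serves catalog metadata as JSON. A minimal sketch of pulling dataset titles out of a CKAN-style `package_search` response; the sample payload below is hand-written for illustration, not fetched from any portal, and the field names are assumptions to verify against the real API.

```python
import json

# Hand-written sample mimicking the shape of a CKAN `package_search`
# response (as served by portals such as catalog.data.gov); the field
# names here are illustrative assumptions, not a captured payload.
sample = json.loads("""
{
  "success": true,
  "result": {
    "count": 2,
    "results": [
      {"title": "Air Quality Measures", "organization": {"title": "EPA"}},
      {"title": "Census Tract Boundaries", "organization": {"title": "Census Bureau"}}
    ]
  }
}
""")

def dataset_titles(response):
    """Extract dataset titles from a CKAN-style package_search response."""
    if not response.get("success"):
        return []
    return [pkg["title"] for pkg in response["result"]["results"]]

print(dataset_titles(sample))
```

The same parsing applies to a live response from `GET /api/3/action/package_search` on a CKAN portal, with `urllib.request` or `requests` supplying the JSON.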
&lt;h2 id="healthcare"&gt;Healthcare
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://dj2taa9i652rf.cloudfront.net/" target="_blank" rel="noopener"
&gt;AWS COVID-19 Datasets - We're working with organizations that make
COVID-19-related data
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Aws-COVID-19.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.cdc.gov/Case-Surveillance/COVID-19-Case-Surveillance-Public-Use-Data/vbim-akqf" target="_blank" rel="noopener"
&gt;COVID-19 Case Surveillance Public Use Data - The COVID-19 case
surveillance system database
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/COVID-19-Case-Surveillance-Public-Use-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/andrab/ecuacovid" target="_blank" rel="noopener"
&gt;Covid-19 non-processed data of Ecuador - A project that provides
non-processed datasets
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/COVID-19-Ecuador-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/CSSEGISandData/COVID-19" target="_blank" rel="noopener"
&gt;2019 Novel Coronavirus COVID-19 Data Repository by Johns Hopkins
CSSE - This is the data
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/COVID-19-Johns-Hopkins.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/nytimes/covid-19-data" target="_blank" rel="noopener"
&gt;Coronavirus (Covid-19) Data in the United States - The New York
Times is releasing a series
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/COVID-19-New-York-Times.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://healthdata.gov/dataset/covid-19-reported-patient-impact-and-hospital-capacity-facility?SorourMo/38-Cloud-A-Cloud-Segmentation-Dataset" target="_blank" rel="noopener"
&gt;COVID-19 Reported Patient Impact and Hospital Capacity by
Facility - The following dataset
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/COVID-19-Reported-Patient-Impact-and-Hospital-Capacity-by-Facility.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.nal.usda.gov/dataset/composition-foods-raw-processed-prepared-usda-national-nutrient-database-standard-reference-release-27" target="_blank" rel="noopener"
&gt;Composition of Foods Raw, Processed, Prepared USDA National
Nutrient Database for Standard
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Composition-of-Foods-Raw-Processed-Prepared-USDA-National-Nutrient-Database-for-Standard-Reference.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://covidtracking.com/data" target="_blank" rel="noopener"
&gt;The COVID Tracking Project - The COVID Tracking Project collects
and publishes the most \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Covid-Tracking-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.ehdp.com/vitalnet/datasets.htm" target="_blank" rel="noopener"
&gt;EHDP Large Health Data
Sets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/EHDP-Large-Health-Data-Sets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://gdc.cancer.gov/" target="_blank" rel="noopener"
&gt;GDC - The Genomic Data Commons supports several cancer genome
programs, including TCGA and TARGET&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/GDC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.gapminder.org/data/" target="_blank" rel="noopener"
&gt;Gapminder World demographic
databases&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Gapminder-World-demographic-databases.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.nlm.nih.gov/mesh/filelist.html" target="_blank" rel="noopener"
&gt;MeSH, the vocabulary thesaurus used for indexing articles for
PubMed&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/MeSH-the-vocabulary-thesaurus-used-for-indexing-articles-for-PubMed.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/BruceWen120/medal" target="_blank" rel="noopener"
&gt;MeDAL - A large medical text dataset curated for abbreviation
disambiguation - Medical
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Medal-medical-abbreviations.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.cms.gov/medicare-coverage-database/" target="_blank" rel="noopener"
&gt;Medicare Coverage Database (MCD),
U.S.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Medicare-Coverage-Database-MCD-U.S..yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://data.medicare.gov/" target="_blank" rel="noopener"
&gt;Medicare Data Engine of medicare.gov
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Medicare-Data-Engine-of-medicare.gov-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://go.cms.gov/19xxPN4" target="_blank" rel="noopener"
&gt;Medicare Data File&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Medicare-Data-File.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://docs.nightingalescience.org/" target="_blank" rel="noopener"
&gt;Nightingale Open Science&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Nightingale.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.humdata.org/dataset/ebola-cases-2014" target="_blank" rel="noopener"
&gt;Number of Ebola Cases and Deaths in Affected Countries
(2014)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Number-of-Ebola-Cases-and-Deaths-in-Affected-Countries-2014.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.openods.co.uk" target="_blank" rel="noopener"
&gt;Open-ODS (structure of the UK NHS)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Open-ODS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://openpaymentsdata.cms.gov" target="_blank" rel="noopener"
&gt;OpenPaymentsData, Healthcare financial relationship
data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/OpenPaymentsData-Healthcare-financial-relationship-data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.physionet.org/physiobank/database/" target="_blank" rel="noopener"
&gt;PhysioBank Databases - A large and growing archive of physiological
data.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/PhysioBank-Databases.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/datasets/thepodclub/spanish-flu-dataset" target="_blank" rel="noopener"
&gt;Spanish Flu Dataset - Historical dataset about the 1918&amp;ndash;1920
Spanish Flu pandemic, including
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Spanish-Flu.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.cancerimagingarchive.net" target="_blank" rel="noopener"
&gt;The Cancer Imaging Archive
(TCIA)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/TCIA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://portal.gdc.cancer.gov/" target="_blank" rel="noopener"
&gt;The Cancer Genome Atlas project
(TCGA)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/The-Cancer-Genome-Atlas-project-TCGA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.who.int/gho/en/" target="_blank" rel="noopener"
&gt;World Health Organization Global Health
Observatory&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/World-Health-Organization-Global-Health-Observatory.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/yahoo/covid-19-data" target="_blank" rel="noopener"
&gt;Yahoo Knowledge Graph COVID-19 Datasets - The Yahoo Knowledge Graph
team at Verizon Media is
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/Yahoo-COVID-19.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.i2b2.org/NLP/DataSets/Main.php" target="_blank" rel="noopener"
&gt;Informatics for Integrating Biology and the
Bedside&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Healthcare/i2b2.yml)\]&lt;/li&gt;
&lt;/ul&gt;
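Several of the healthcare sources above (the Johns Hopkins CSSE repository in particular) publish time series in a wide CSV layout, with one column per date. A minimal sketch of reshaping such a file to long `(country, date, count)` rows with the standard library; the header names below follow the repository's published layout as I recall it, so verify them against the actual files before relying on this.

```python
import csv
import io

# Tiny sample in the wide layout used by the CSSE time-series CSVs
# (Province/State, Country/Region, Lat, Long, then one column per date).
# Column names are reproduced from memory -- an assumption to check
# against the real files.
wide_csv = io.StringIO(
    "Province/State,Country/Region,Lat,Long,1/22/20,1/23/20\n"
    ",Ecuador,-1.8312,-78.1834,0,0\n"
    ",Switzerland,46.8182,8.2275,0,1\n"
)

def wide_to_long(f):
    """Reshape wide (region, date-columns...) rows into (country, date, count) triples."""
    reader = csv.DictReader(f)
    fixed = {"Province/State", "Country/Region", "Lat", "Long"}
    rows = []
    for rec in reader:
        for col, val in rec.items():
            if col not in fixed:
                rows.append((rec["Country/Region"], col, int(val)))
    return rows

long_rows = wide_to_long(wide_csv)
print(long_rows)
```

Long form is usually what plotting and aggregation code wants; pandas users would reach for `DataFrame.melt` to do the same reshape in one call.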
&lt;h2 id="imageprocessing"&gt;ImageProcessing
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://wilmabainbridge.com/facememorability2.html" target="_blank" rel="noopener"
&gt;10k US Adult Faces
Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/10k-US-Adult-Faces-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/crawford/cat-dataset/version/2" target="_blank" rel="noopener"
&gt;2GB of Photos of
Cats&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/2GB-of-Photos-of-Cats.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.openu.ac.il/home/hassner/Adience/data.html" target="_blank" rel="noopener"
&gt;Adience Unfiltered faces for gender and age
classification&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Adience-Unfiltered-faces-for-gender-and-age-classification.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.imageemotion.org/" target="_blank" rel="noopener"
&gt;Affective Image Classification&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Affective-Image-Classification.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.aicrowd.com/challenges/airborne-object-tracking-challenge" target="_blank" rel="noopener"
&gt;Airborne Object Detection and Tracking - The Airborne Object
Tracking (AOT) dataset is a
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Airborne-Object-Detection-and-Tracking.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://attributes.kyb.tuebingen.mpg.de/" target="_blank" rel="noopener"
&gt;Animals with attributes&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Animals-with-attributes.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://caddy-underwater-datasets.ge.issia.cnr.it/" target="_blank" rel="noopener"
&gt;CADDY Underwater Stereo-Vision Dataset of divers' hand gestures -
Contains 10K stereo pair
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/CADDY-Underwater-Stereo-Vision-Dataset-of-hand-gestures.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://doi.org/10.17632/wg4bpm33hj.2" target="_blank" rel="noopener"
&gt;Cytology Dataset &amp;ndash; CCAgT: Images of Cervical Cells with AgNOR
Stain Technique - Contains 9339
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/CCAgT.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.vision.caltech.edu/Image_Datasets/CaltechPedestrians/" target="_blank" rel="noopener"
&gt;Caltech Pedestrian Detection
Benchmark&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Caltech-Pedestrian-Detection-Benchmark.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.ee.surrey.ac.uk/CVSSP/demos/chars74k/" target="_blank" rel="noopener"
&gt;Chars74K dataset - Character Recognition in Natural Images (both
English and Kannada are
available)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Chars74K-dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/Visillect/CubePlusPlus" target="_blank" rel="noopener"
&gt;Cube++ - 4890 raw 18-megapixel images, each containing a SpyderCube
color target in their
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Cube-Plus-Plus.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://mediatum.ub.tum.de/1596437" target="_blank" rel="noopener"
&gt;Densely Annotated Video Driving Data Set - This data set consists
of 28 video sequences of
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/DAVID.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.gwern.net/Danbooru" target="_blank" rel="noopener"
&gt;Danbooru Tagged Anime Illustration Dataset - A large-scale anime
image database with 3.33m+ \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Danbooru-Tagged-Anime-Illustration-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://vision.cs.duke.edu/DukeMTMC/" target="_blank" rel="noopener"
&gt;DukeMTMC Data Set - DukeMTMC aims to accelerate advances in
multi-target multi-camera
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/DukeMTMC-Data-Set.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://doi.org/10.3929/ethz-b-000365379" target="_blank" rel="noopener"
&gt;ETH Entomological Collection (ETHEC) Fine Grained Butterfly
(Lepidoptera) Images&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/ETH_Entomological_Collection_Fine_Grained_Butterfly_Images.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.face-rec.org/databases/" target="_blank" rel="noopener"
&gt;Face Recognition Benchmark&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Face-Recognition-Benchmark.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.multimedia-computing.de/flickrlogos/" target="_blank" rel="noopener"
&gt;Flickr: 32 Class Brand
Logos&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Flickr-32-Class-Brand-Logos.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://dmery.ing.puc.cl/index.php/material/gdxray/" target="_blank" rel="noopener"
&gt;GDXray - X-ray images for X-ray testing and Computer
Vision&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/GDXray.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://humaneva.is.tue.mpg.de/" target="_blank" rel="noopener"
&gt;HumanEva Dataset - The HumanEva-I dataset contains 7 calibrated
video sequences (4 grayscale
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/HumanEva-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.image-net.org/" target="_blank" rel="noopener"
&gt;ImageNet (in WordNet hierarchy)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/ImageNet.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://web.mit.edu/torralba/www/indoor.html" target="_blank" rel="noopener"
&gt;Indoor Scene
Recognition&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Indoor-Scene-Recognition.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://csea.phhp.ufl.edu/media/iapsmessage.html" target="_blank" rel="noopener"
&gt;International Affective Picture System,
UFL&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/International-Affective-Picture-System-UFL.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cvlibs.net/datasets/kitti/" target="_blank" rel="noopener"
&gt;KITTI Vision Benchmark
Suite&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/KITTI-Vision-Benchmark-Suite.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://lila.science" target="_blank" rel="noopener"
&gt;Labeled Information Library of Alexandria - Biology and
Conservation - Contains over 10 \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/LILA-BC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://dx.doi.org/10.21227/6htp-py25" target="_blank" rel="noopener"
&gt;Long duration stitched and unstitched 8K/30 fps stereoscopic 360°
videos - This 360° video
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Long-duration-stitched-and-unstitched-8K-30-fps-stereoscopic-360deg-videos.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://yann.lecun.com/exdb/mnist/" target="_blank" rel="noopener"
&gt;MNIST database of handwritten digits, near 1 million
examples&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/MNIST-database-of-handwritten-digits-near-1-million-examples.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://mediatum.ub.tum.de/1548761" target="_blank" rel="noopener"
&gt;Multi-View Region of Interest Prediction Dataset for Autonomous
Driving - Contains 16 driving
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/MV-ROI.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://olivalab.mit.edu/MM/stimuli.html" target="_blank" rel="noopener"
&gt;Massive Visual Memory Stimuli,
MIT&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Massive-Visual-Memory-Stimuli-MIT.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://news-navigator.labs.loc.gov/" target="_blank" rel="noopener"
&gt;Newspaper Navigator - This dataset consists of extracted visual
content for 16,358,041
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Newspaper-Navigator.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://storage.googleapis.com/openimages/web/download.html" target="_blank" rel="noopener"
&gt;Open Images From Google - Pictures with segmentation masks for 2.8
million object instances
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/OpenImagesByGoogle.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/mhmoodlan/arabic-font-classification/releases/tag/v0.1.0" target="_blank" rel="noopener"
&gt;RuFa - Contains images of text written in one of two Arabic fonts
(Ruqaa and Nastaliq
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/RuFa-Arabic-font-dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://groups.csail.mit.edu/vision/SUN/hierarchy.html" target="_blank" rel="noopener"
&gt;SUN database,
MIT&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/SUN-database-MIT.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://sviro.kl.dfki.de" target="_blank" rel="noopener"
&gt;SVIRO Synthetic Vehicle Interior Rear Seat Occupancy - 25,000
synthetic sceneries across ten \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/SVIRO.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://kaiwolf.no-ip.org/3d-model-repository.html" target="_blank" rel="noopener"
&gt;Several Shape-from-Silhouette
Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Several-Shape-from-Silhouette-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://vision.stanford.edu/aditya86/ImageNetDogs/" target="_blank" rel="noopener"
&gt;Stanford Dogs
Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Stanford-Dogs-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.openu.ac.il/home/hassner/data/ASLAN/ASLAN.html" target="_blank" rel="noopener"
&gt;The Action Similarity Labeling (ASLAN)
Challenge&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/The-Action-Similarity-Labeling-ASLAN-Challenge.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.robots.ox.ac.uk/~vgg/data/pets/" target="_blank" rel="noopener"
&gt;The Oxford-IIIT Pet
Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/The-Oxford-IIIT-Pet-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.openu.ac.il/home/hassner/data/violentflows/" target="_blank" rel="noopener"
&gt;Violent-Flows - Crowd Violence / Non-violence Database and
benchmark&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Violent-Flows.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://visualgenome.org/api/v0/api_home.html" target="_blank" rel="noopener"
&gt;Visual genome&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/Visual-genome.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.cs.tau.ac.il/~wolf/ytfaces/" target="_blank" rel="noopener"
&gt;YouTube Faces Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ImageProcessing/YouTube-Faces-Database.yml)\]&lt;/li&gt;
&lt;/ul&gt;
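The ok-24.png / fixme-24.png icons above appear to mark whether a dataset link was still reachable when the list was last checked. A minimal sketch of such a checker, using only the standard library; the helper names (`icon_for_status`, `check_link`) are hypothetical and not part of apd-core.

```python
# Sketch: map link reachability to the list's status icons.
# Assumption: 2xx/3xx responses count as OK, anything else as FIXME.
import urllib.request
import urllib.error

def icon_for_status(status):
    """Map an HTTP status code to the list's status icon name."""
    return "ok-24.png" if status in range(200, 400) else "fixme-24.png"

def check_link(url, timeout=10):
    """HEAD-request a dataset homepage and pick the matching icon."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "apd-link-check/0.1"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return icon_for_status(resp.status)
    except (urllib.error.URLError, ValueError):
        return "fixme-24.png"
```

A HEAD request is used so the checker never downloads dataset payloads; sites that reject HEAD would need a GET fallback.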
&lt;h2 id="machinelearning"&gt;MachineLearning
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/JingchunCheng/All-Age-Faces-Dataset" target="_blank" rel="noopener"
&gt;All-Age-Faces Dataset - Contains 13,322 Asian face images
distributed across all ages (from 2
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/All-Age-Faces-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.a2d2.audi/a2d2/en.html" target="_blank" rel="noopener"
&gt;Audi Autonomous Driving Dataset - We have published the Audi
Autonomous Driving Dataset
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Audi-Autonomous-Driving-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/kbesenic/B3FD" target="_blank" rel="noopener"
&gt;B3FD - Facial age (and gender) estimation dataset with 375k
images - The B3FD dataset is a
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Biometrically-Filtered-Famous-Figure-Dataset-for-Age-Estimation.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/irecsys/CARSKit/tree/master/context-aware_data_sets" target="_blank" rel="noopener"
&gt;Context-aware data sets from five
domains&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Context-aware-datasets-from-five-domains.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cs.toronto.edu/~delve/data/datasets.html" target="_blank" rel="noopener"
&gt;Delve Datasets for classification and
regression&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Delve-Datasets-for-classification-and-regression.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.discogs.com/" target="_blank" rel="noopener"
&gt;Discogs Monthly Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Discogs-Monthly-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://amsacta.unibo.it/id/eprint/6706" target="_blank" rel="noopener"
&gt;Fluorescent Neuronal Cells - By releasing this dataset, we aim at
providing a new testbed for
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Fluorescent-Neuronal-Cells.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/mdeff/fma" target="_blank" rel="noopener"
&gt;Free Music Archive&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Free-Music-Archive.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.imdb.com/interfaces" target="_blank" rel="noopener"
&gt;IMDb Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/IMDb-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://alitourani.github.io/Iranis-dataset/" target="_blank" rel="noopener"
&gt;Iranis - A Large-scale Dataset of Farsi/Arabic License Plate
Characters&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Iranis.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://sci2s.ugr.es/keel/datasets.php" target="_blank" rel="noopener"
&gt;Keel Repository for classification, regression and time
series&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Keel-Repository-for-classification-regression-and-time-series.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://bupt-ai-cz.github.io/LLVIP/" target="_blank" rel="noopener"
&gt;LLVIP - This dataset contains 30976 images, or 15488 pairs, most of
which were taken at very
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/LLVIP.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://vis-www.cs.umass.edu/lfw/" target="_blank" rel="noopener"
&gt;Labeled Faces in the Wild (LFW)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Labeled-Faces-in-the-Wild-LFW.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.lendingclub.com/info/download-data.action" target="_blank" rel="noopener"
&gt;Lending Club Loan
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Lending-Club-Loan-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://mldata.org/" target="_blank" rel="noopener"
&gt;Machine Learning Data Set Repository&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Machine-Learning-Data-Set-Repository.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://labrosa.ee.columbia.edu/millionsong/" target="_blank" rel="noopener"
&gt;Million Song Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Million-Song-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://labrosa.ee.columbia.edu/millionsong/pages/additional-datasets" target="_blank" rel="noopener"
&gt;More Song
Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/More-Song-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://grouplens.org/datasets/movielens/" target="_blank" rel="noopener"
&gt;MovieLens Data Sets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/MovieLens-Data-Sets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/nextml/caption-contest-data" target="_blank" rel="noopener"
&gt;New Yorker caption contest
ratings&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/New-Yorker-caption-contest-ratings.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.rdatamining.com/data" target="_blank" rel="noopener"
&gt;RDataMining - &amp;quot;R and Data Mining&amp;quot; ebook
data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/RDataMining.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://publichealthintelligence.org/content/registered-meteorites-has-impacted-earth-visualized" target="_blank" rel="noopener"
&gt;Registered Meteorites on
Earth&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Registered-Meteorites-on-Earth.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.sfgov.org/Health-and-Social-Services/Restaurant-Scores-LIVES-Standard/pyih-qa8i?row_index=0" target="_blank" rel="noopener"
&gt;Restaurants Health Score Data in San
Francisco&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Restaurants-Health-Score-Data-in-San-Francisco.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.yasamin.page/hdnet_tiktok" target="_blank" rel="noopener"
&gt;TikTok Dataset - More than 300 dance videos that capture a single
person performing dance
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Tik-Tok-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://archive.ics.uci.edu/ml/" target="_blank" rel="noopener"
&gt;UCI Machine Learning Repository&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/UCI-Machine-Learning-Repository.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://huggingface.co/datasets/yandex/yambda" target="_blank" rel="noopener"
&gt;Yambda-5B &amp;mdash; A Large-Scale Multi-modal Dataset for Ranking And
Retrieval - Industrial-scale
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/YaMBDa-5B-Music-Interaction-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://webscope.sandbox.yahoo.com/catalog.php?datatype=r" target="_blank" rel="noopener"
&gt;Yahoo! Ratings and Classification
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Yahoo-Ratings-and-Classification-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://research.google.com/youtube-bb/" target="_blank" rel="noopener"
&gt;YouTube-BoundingBoxes&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/YouTube-BoundingBoxes.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://research.google.com/youtube8m/download.html" target="_blank" rel="noopener"
&gt;Youtube 8m&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/Youtube-8m.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.modelingonlineauctions.com/datasets" target="_blank" rel="noopener"
&gt;eBay Online Auctions
(2012)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//MachineLearning/eBay-Online-Auctions-2012.yml)\]&lt;/li&gt;
&lt;/ul&gt;
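Several entries above (MovieLens, Lending Club, the UCI repository) distribute plain CSV files. A quick aggregation sketch, assuming the ml-latest-small column layout documented by GroupLens (userId,movieId,rating,timestamp); the inline SAMPLE rows are made up for illustration, and the code is standard library only.

```python
# Sketch: per-movie mean rating from a MovieLens-style ratings CSV.
import csv
import io
from collections import defaultdict

SAMPLE = """userId,movieId,rating,timestamp
1,31,2.5,1260759144
1,1029,3.0,1260759179
2,31,4.0,835355493
"""

def mean_rating_per_movie(fh):
    """Average the rating column per movieId from an open CSV handle."""
    totals = defaultdict(lambda: [0.0, 0])   # movieId: [sum, count]
    for row in csv.DictReader(fh):
        acc = totals[row["movieId"]]
        acc[0] += float(row["rating"])
        acc[1] += 1
    return {movie: s / n for movie, (s, n) in totals.items()}

print(mean_rating_per_movie(io.StringIO(SAMPLE)))
# prints {'31': 3.25, '1029': 3.0}
```

For the real files, replace the StringIO with `open("ratings.csv", newline="")`.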
&lt;h2 id="museums"&gt;Museums
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://techno-science.ca/en/data.php" target="_blank" rel="noopener"
&gt;Canada Science and Technology Museums Corporation's Open
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Museums/Canada-Science-and-Technology-Museums-Corporations-Open-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/cooperhewitt/collection" target="_blank" rel="noopener"
&gt;Cooper-Hewitt's Collection
Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Museums/Cooper-Hewitt-Collection-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://metmuseum.github.io/" target="_blank" rel="noopener"
&gt;Metropolitan Museum of Art Collection
API&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Museums/Metropolitan-Museum-of-Art-Collection-API.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/artsmia/collection" target="_blank" rel="noopener"
&gt;Minneapolis Institute of Arts
metadata&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Museums/Minneapolis-Institute-of-Arts-metadata.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.nhm.ac.uk/" target="_blank" rel="noopener"
&gt;Natural History Museum (London) Data
Portal&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Museums/Natural-History-Museum-London-Data-Portal.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.rijksmuseum.nl/en/api" target="_blank" rel="noopener"
&gt;Rijksmuseum Historical Art
Collection&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Museums/Rijksmuseum-Historical-Art-Collection.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/tategallery/collection" target="_blank" rel="noopener"
&gt;Tate Collection
metadata&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Museums/Tate-Collection-metadata.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://vocab.getty.edu" target="_blank" rel="noopener"
&gt;The Getty vocabularies&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Museums/The-Getty-vocabularies.yml)\]&lt;/li&gt;
&lt;/ul&gt;
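The Metropolitan Museum of Art entry above points at a public JSON API. A thin stdlib sketch of its single-object endpoint, with the base URL and path taken from the public docs at metmuseum.github.io; `fetch_object` is an illustrative wrapper, not an official client.

```python
# Sketch: query one object record from the Met Collection API.
import json
import urllib.request

BASE = "https://collectionapi.metmuseum.org/public/collection/v1"

def object_url(object_id):
    """Build the URL for a single collection object record."""
    return f"{BASE}/objects/{int(object_id)}"

def fetch_object(object_id, timeout=10):
    """Retrieve one object record as a dict (requires network access)."""
    with urllib.request.urlopen(object_url(object_id), timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

The API needs no key; responses are JSON dicts with fields such as title and artist metadata, per the published documentation.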
&lt;h2 id="naturallanguage"&gt;NaturalLanguage
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/snkim/AutomaticKeyphraseExtraction/" target="_blank" rel="noopener"
&gt;Automatic Keyphrase
Extraction&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Automatic-Keyphrase-Extraction.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://datasets.quantumstat.com" target="_blank" rel="noopener"
&gt;The Big Bad NLP Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/BigBadNLPDatabase.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.synsig.org/index.php/Blizzard_Challenge_2018" target="_blank" rel="noopener"
&gt;Blizzard Challenge Speech - The speech + text data comes from
professional audiobooks
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Blizzard-Speech.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://u.cs.biu.ac.il/~koppel/BlogCorpus.htm" target="_blank" rel="noopener"
&gt;Blogger Corpus&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Blogger-Corpus.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://zenodo.org/record/4639616" target="_blank" rel="noopener"
&gt;CLiPS Stylometry Investigation
Corpus&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/CLiPS-Stylometry-Investigation-Corpus.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://lemurproject.org/clueweb09/FACC1/" target="_blank" rel="noopener"
&gt;ClueWeb09 FACC&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/ClueWeb09-FACC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://lemurproject.org/clueweb12/FACC1/" target="_blank" rel="noopener"
&gt;ClueWeb12 FACC&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/ClueWeb12-FACC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://databus.dbpedia.org/dbpedia/collections/latest-core" target="_blank" rel="noopener"
&gt;DBpedia - Structured data from
Wikipedia&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/DBpedia.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/LDNOOBW/List-of-Dirty-Naughty-Obscene-and-Otherwise-Bad-Words" target="_blank" rel="noopener"
&gt;Dirty Words - With millions of images in our library and billions
of user-submitted keywords,
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Dirty-Words.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.isi.edu/~lerman/downloads/flickr/flickr_taxonomies.html" target="_blank" rel="noopener"
&gt;Flickr Personal
Taxonomies&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Flickr-Personal-Taxonomies.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.freebase.com/" target="_blank" rel="noopener"
&gt;Freebase of people, places, and things&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Freebase-of-people-places-and-things.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://adrien.barbaresi.eu/corpora/speeches/" target="_blank" rel="noopener"
&gt;German Political Speeches Corpus - Collection of political speeches
from the German
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/German-Political-Speeches-Corpus.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://aws.amazon.com/datasets/google-books-ngrams/" target="_blank" rel="noopener"
&gt;Google Books Ngrams
(2.2TB)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Google-Books-Ngrams-2.2TB.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/google/mcafp" target="_blank" rel="noopener"
&gt;Google MC-AFP - Generated from the publicly available Gigaword
dataset using Paragraph Vectors&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Google-MC-AFP.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://catalog.ldc.upenn.edu/LDC2006T13" target="_blank" rel="noopener"
&gt;Google Web 5gram (1TB,
2006)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Google-Web-5gram-1TB-2006.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.gutenberg.org/wiki/Gutenberg:Offline_Catalogs" target="_blank" rel="noopener"
&gt;Gutenberg eBooks
List&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Gutenberg-eBooks-List.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.isi.edu/natural-language/download/hansard/" target="_blank" rel="noopener"
&gt;Hansards text chunks of Canadian
Parliament&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Hansards-text-chunks-of-Canadian-Parliament.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://keithito.com/LJ-Speech-Dataset" target="_blank" rel="noopener"
&gt;LJ Speech - Speech dataset consisting of 13,100 short audio clips
of a single speaker reading
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/LJ-Speech.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.m-ailabs.bayern/en/the-mailabs-speech-dataset/" target="_blank" rel="noopener"
&gt;M-AILabs Speech - The M-AILABS Speech Dataset is the first large
dataset that we are
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/M-AILABS-Speech.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.msmarco.org/dataset.aspx" target="_blank" rel="noopener"
&gt;Microsoft MAchine Reading COmprehension Dataset (or MS
MARCO)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/MS-MARCO.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://mattr1.github.io/mctest/" target="_blank" rel="noopener"
&gt;Machine Comprehension Test (MCTest) of text from Microsoft
Research&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Machine-Comprehension-Test-MCTest-of-text-from-Microsoft-Research.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://statmt.org/wmt11/translation-task.html#download" target="_blank" rel="noopener"
&gt;Machine Translation of European
languages&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Machine-Translation-of-European-languages.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://oak.dcs.shef.ac.uk/msm2013/challenge.html" target="_blank" rel="noopener"
&gt;Making Sense of Microposts 2013 - Concept
Extraction&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Making-Sense-of-Microposts-2013.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://microposts2016.seas.upenn.edu/challenge.html" target="_blank" rel="noopener"
&gt;Making Sense of Microposts 2016 - Named Entity rEcognition and
Linking&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Making-Sense-of-Microposts-2016.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cs.jhu.edu/~mdredze/datasets/sentiment/" target="_blank" rel="noopener"
&gt;Multi-Domain Sentiment Dataset (version
2.0)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Multi-Domain-Sentiment-Dataset-version-2.0.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://huggingface.co/datasets/allenai/nllb" target="_blank" rel="noopener"
&gt;No Language Left Behind (NLLB - 200vo) - Dataset based on Meta's
metadata for mined bitext.
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/NoLanguageLeftBehindNLLB200vo.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://datashare.is.ed.ac.uk/handle/10283/2791" target="_blank" rel="noopener"
&gt;Noisy speech database for training speech enhancement algorithms
and TTS models - Clean and
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Noisy-Speech.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://compling.hss.ntu.edu.sg/omw/" target="_blank" rel="noopener"
&gt;Open Multilingual Wordnet&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Open-Multilingual-Wordnet.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/aritter/twitter_nlp/tree/master/data/annotated" target="_blank" rel="noopener"
&gt;POS/NER/Chunk annotated
data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/POS-NER-Chunk-annotated-data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.clips.uantwerpen.be/datasets/personae-corpus" target="_blank" rel="noopener"
&gt;Personae
Corpus&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Personae-Corpus.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.dt.fee.unicamp.br/~tiago/smsspamcollection/" target="_blank" rel="noopener"
&gt;SMS Spam Collection in
English&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/SMS-Spam-Collection-in-English.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/ParallelMazen/SaudiNewsNet" target="_blank" rel="noopener"
&gt;SaudiNewsNet Collection of Saudi Newspaper Articles (Arabic, 30K
articles)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/SaudiNewsNet-Collection-of-Saudi-Newspaper-Articles-Arabic-30K-articles.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://rajpurkar.github.io/SQuAD-explorer/" target="_blank" rel="noopener"
&gt;Stanford Question Answering Dataset
(SQuAD)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Stanford-Question-Answering-Dataset-SQuAD.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.psych.ualberta.ca/~westburylab/downloads/usenetcorpus.download.html" target="_blank" rel="noopener"
&gt;USENET postings corpus of
2005~2011&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/USENET-postings-corpus-of-2005~2011.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://universaldependencies.org" target="_blank" rel="noopener"
&gt;Universal Dependencies&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Universal-Dependencies.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://webhose.io/datasets" target="_blank" rel="noopener"
&gt;Webhose - News/Blogs in multiple
languages&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Webhose.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.wikidata.org/wiki/Wikidata:Database_download" target="_blank" rel="noopener"
&gt;Wikidata - Wikipedia
databases&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Wikidata.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://code.google.com/p/wiki-links/downloads/list" target="_blank" rel="noopener"
&gt;Wikipedia Links data - 40 Million Entities in
Context&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Wikipedia-Links-data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://wordnet.princeton.edu/download/" target="_blank" rel="noopener"
&gt;WordNet databases and
tools&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/WordNet-databases-and-tools.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://wordbank.stanford.edu/" target="_blank" rel="noopener"
&gt;Wordbank - Open, de-identified database of vocabulary development
from 84,138 children and \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Wordbank.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cognitiveai.org/explanationbank" target="_blank" rel="noopener"
&gt;WorldTree Corpus of Explanation Graphs for Elementary Science
Questions - a corpus of
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//NaturalLanguage/Worldtree-Explanation-Corpus.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="neuroscience"&gt;Neuroscience
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.brain-map.org/" target="_blank" rel="noopener"
&gt;Allen Institute Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/Allen-Institute-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://braincatalogue.org/" target="_blank" rel="noopener"
&gt;Brain Catalogue&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/Brain-Catalogue.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://brainomics.cea.fr/localizer" target="_blank" rel="noopener"
&gt;Brainomics&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/Brainomics.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://datasets.codeneuro.org/" target="_blank" rel="noopener"
&gt;CodeNeuro Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/CodeNeuro-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://crcns.org/data-sets" target="_blank" rel="noopener"
&gt;Collaborative Research in Computational Neuroscience
(CRCNS)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/Collaborative-Research-in-Computational-Neuroscience-CRCNS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://fcon_1000.projects.nitrc.org/index.html" target="_blank" rel="noopener"
&gt;FCP-INDI&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/FCP-INDI.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.humanconnectome.org/data/" target="_blank" rel="noopener"
&gt;Human Connectome Project&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/Human-Connectome-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://ndar.nih.gov/" target="_blank" rel="noopener"
&gt;NDAR&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/NDAR.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://data-archive.nimh.nih.gov/" target="_blank" rel="noopener"
&gt;NIMH Data Archive&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/NIMH-Data-Archive.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://neurodata.io" target="_blank" rel="noopener"
&gt;NeuroData&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/NeuroData.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://neuromorpho.org/" target="_blank" rel="noopener"
&gt;NeuroMorpho - NeuroMorpho.Org is a centrally curated inventory of
digitally reconstructed \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/NeuroMorpho.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://neuroelectro.org/" target="_blank" rel="noopener"
&gt;Neuroelectro&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/Neuroelectro.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.oasis-brains.org/" target="_blank" rel="noopener"
&gt;OASIS&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/OASIS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://openneuro.org/public/datasets" target="_blank" rel="noopener"
&gt;OpenNEURO&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/OpenNEURO.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://openfmri.org/" target="_blank" rel="noopener"
&gt;OpenfMRI&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/OpenfMRI.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://studyforrest.org" target="_blank" rel="noopener"
&gt;Study Forrest&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/Study-Forrest.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://gigadb.org/dataset/100990" target="_blank" rel="noopener"
&gt;The Nencki-Symfonia EEG/ERP dataset - A high-density
electroencephalography (EEG) dataset
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Neuroscience/The_Nencki-Symfonia_EEG_ERP_dataset.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="physics"&gt;Physics
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://opendata.cern.ch/" target="_blank" rel="noopener"
&gt;CERN Open Data Portal&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Physics/CERN-Open-Data-Portal.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.crystallography.net/" target="_blank" rel="noopener"
&gt;Crystallography Open Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Physics/Crystallography-Open-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://icecube.wisc.edu/science/data" target="_blank" rel="noopener"
&gt;IceCube - South Pole Neutrino
Observatory&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Physics/IceCube.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://losc.ligo.org" target="_blank" rel="noopener"
&gt;LIGO Open Science Center (LOSC) - Gravitational wave data from the
LIGO Hanford and \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Physics/LIGO-Open-Science-Center.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://exoplanetarchive.ipac.caltech.edu/" target="_blank" rel="noopener"
&gt;NASA Exoplanet Archive&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Physics/NASA-Exoplanet-Archive.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://nssdc.gsfc.nasa.gov/nssdc/obtaining_data.html" target="_blank" rel="noopener"
&gt;NSSDC (NASA) data of 550
spacecraft&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Physics/NSSDC-NASA-data-of-550-space-spacecraft.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://doi.org/10.4224/PhysRevA.96.042113.data" target="_blank" rel="noopener"
&gt;Quantum simulations of an electron in a two dimensional potential
well - The data was
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Physics/Quantum.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.sdss.org/" target="_blank" rel="noopener"
&gt;Sloan Digital Sky Survey (SDSS) - Mapping the
Universe&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Physics/Sloan-Digital-Sky-Survey-SDSS.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="prostatecancer"&gt;ProstateCancer
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://dcc.icgc.org/projects/EOPC-DE" target="_blank" rel="noopener"
&gt;EOPC-DE-Early-Onset-Prostate-Cancer-Germany - Early Onset Prostate
Cancer - Germany. \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/EOPC-DE-Early-Onset-Prostate-Cancer-Germany.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.synapse.org/genie" target="_blank" rel="noopener"
&gt;GENIE - Data from the Genomics Evidence Neoplasia Information
Exchange (GENIE) project of the
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/GENIE.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_cpcg_2017" target="_blank" rel="noopener"
&gt;Genomic-Hallmarks-Prostate-Adenocarcinoma-CPC-GENE - Comprehensive
genomic profiling of 477
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Genomic-Hallmarks-Prostate-Adenocarcinoma-CPC-GENE.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_mskcc_2017" target="_blank" rel="noopener"
&gt;MSK-IMPACT-Clinical-Sequencing-Cohort-MSKCC-Prostate-Cancer -
Targeted sequencing of clinical
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/MSK-IMPACT-Clinical-Sequencing-Cohort-MSKCC-Prostate-Cancer.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_mich" target="_blank" rel="noopener"
&gt;Metastatic-Prostate-Adenocarcinoma-MCTP - Comprehensive profiling
of 61 prostate cancer
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Metastatic-Prostate-Adenocarcinoma-MCTP.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_su2c_2015" target="_blank" rel="noopener"
&gt;Metastatic-Prostate-Cancer-SU2CPCF-Dream-Team - Comprehensive
analysis of 150 metastatic
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Metastatic-Prostate-Cancer-SU2CPCF-Dream-Team.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.cdc.gov/cancer/uscs/public-use" target="_blank" rel="noopener"
&gt;NPCR-2001-2015 - Database from CDC's National Program of Cancer
Registries (NPCR). The
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/NPCR-2001-2015.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.cdc.gov/cancer/uscs/public-use" target="_blank" rel="noopener"
&gt;NPCR-2005-2015 - Database from CDC's National Program of Cancer
Registries (NPCR). The
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/NPCR-2005-2015.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://wiki.cancerimagingarchive.net/display/Public/NaF&amp;#43;Prostate" target="_blank" rel="noopener"
&gt;NaF-Prostate - NaF Prostate is a collection of F-18 NaF positron
emission tomography/computed
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/NaF-Prostate.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=nepc_wcm_2016" target="_blank" rel="noopener"
&gt;Neuroendocrine-Prostate-Cancer - Whole exome and RNA Seq data of
castration resistant
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Neuroendocrine-Prostate-Cancer.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://biometry.nci.nih.gov/cdas/plco/" target="_blank" rel="noopener"
&gt;PLCO-Prostate-Diagnostic-Procedures - The Prostate Diagnostic
Procedures dataset (95,837
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/PLCO-Prostate-Diagnostic-Procedures.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://biometry.nci.nih.gov/cdas/plco/" target="_blank" rel="noopener"
&gt;PLCO-Prostate-Medical-Complications - The Prostate Medical
Complications dataset (3,350
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/PLCO-Prostate-Medical-Complications.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://biometry.nci.nih.gov/cdas/plco/" target="_blank" rel="noopener"
&gt;PLCO-Prostate-Screening-Abnormalities - The Prostate Screening
Abnormalities dataset (10,527
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/PLCO-Prostate-Screening-Abnormalities.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://biometry.nci.nih.gov/cdas/plco/" target="_blank" rel="noopener"
&gt;PLCO-Prostate-Screening - The Prostate Screening dataset (177,315
records, 35,875 subjects,
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/PLCO-Prostate-Screening.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://biometry.nci.nih.gov/cdas/plco/" target="_blank" rel="noopener"
&gt;PLCO-Prostate-Treatments - The Prostate Treatments dataset (13,409
records, 7,614 subjects,
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/PLCO-Prostate-Treatments.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://biometry.nci.nih.gov/cdas/plco/" target="_blank" rel="noopener"
&gt;PLCO-Prostate - The Prostate dataset is a comprehensive dataset
that contains nearly all the
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/PLCO-Prostate.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://dcc.icgc.org/projects/PRAD-CA" target="_blank" rel="noopener"
&gt;PRAD-CA-Prostate-Adenocarcinoma-Canada - Prostate Adenocarcinoma -
Canada. Collected by the
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/PRAD-CA-Prostate-Adenocarcinoma-Canada.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://dcc.icgc.org/projects/PRAD-FR" target="_blank" rel="noopener"
&gt;PRAD-FR-Prostate-Adenocarcinoma-France - Prostate Adenocarcinoma -
France. Collected by ten
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/PRAD-FR-Prostate-Adenocarcinoma-France.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://dcc.icgc.org/projects/PRAD-UK" target="_blank" rel="noopener"
&gt;PRAD-UK-Prostate-Adenocarcinoma-United-Kingdom - Prostate
Adenocarcinoma - United Kingdom.
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/PRAD-UK-Prostate-Adenocarcinoma-United-Kingdom.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://wiki.cancerimagingarchive.net/display/Public/SPIE-AAPM-NCI&amp;#43;PROSTATEx&amp;#43;Challenges" target="_blank" rel="noopener"
&gt;PROSTATEx-Challenge - Retrospective set of prostate MR studies. All
studies included
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/PROSTATEx-Challenge.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://wiki.cancerimagingarchive.net/display/Public/PROSTATE-3T" target="_blank" rel="noopener"
&gt;Prostate-3T - The Prostate-3T project provided imaging data to TCIA
as part of an ISBI
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-3T.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_broad" target="_blank" rel="noopener"
&gt;Prostate-Adenocarcinoma-Broad-Cornell-2012 - Comprehensive
profiling of 112 prostate cancer
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-Adenocarcinoma-Broad-Cornell-2012.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_broad_2013" target="_blank" rel="noopener"
&gt;Prostate-Adenocarcinoma-Broad-Cornell-2013 - Comprehensive
profiling of 57 prostate cancer
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-Adenocarcinoma-Broad-Cornell-2013.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_mskcc_2014" target="_blank" rel="noopener"
&gt;Prostate-Adenocarcinoma-CNA-study-MSKCC - Copy-number profiling of
103 primary prostate
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-Adenocarcinoma-CNA-study-MSKCC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_fhcrc" target="_blank" rel="noopener"
&gt;Prostate-Adenocarcinoma-Fred-Hutchinson-CRC - Comprehensive
profiling of prostate cancer
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-Adenocarcinoma-Fred-Hutchinson-CRC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_p1000" target="_blank" rel="noopener"
&gt;Prostate Adenocarcinoma (MSKCC/DFCI) - Whole Exome Sequencing of
1013 prostate cancer
samples.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-Adenocarcinoma-MSKCC-DFCI.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_mskcc" target="_blank" rel="noopener"
&gt;Prostate-Adenocarcinoma-MSKCC - MSKCC Prostate Oncogenome Project.
181 primary, 37 metastatic
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-Adenocarcinoma-MSKCC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_mskcc_cheny1_organoids_2014" target="_blank" rel="noopener"
&gt;Prostate-Adenocarcinoma-Organoids-MSKCC - Exome profiling of
prostate cancer samples and
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-Adenocarcinoma-Organoids-MSKCC.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_eururol_2017" target="_blank" rel="noopener"
&gt;Prostate-Adenocarcinoma-Sun-Lab - Whole-genome and Transcriptome
Sequencing of 65 Prostate
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-Adenocarcinoma-Sun-Lab.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_tcga_pan_can_atlas_2018" target="_blank" rel="noopener"
&gt;Prostate-Adenocarcinoma-TCGA-PanCancer-Atlas - Comprehensive TCGA
PanCanAtlas data from 11k
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-Adenocarcinoma-TCGA-PanCancer-Atlas.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_tcga_pub" target="_blank" rel="noopener"
&gt;Prostate-Adenocarcinoma-TCGA - Integrated profiling of 333 primary
prostate adenocarcinoma
samples.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-Adenocarcinoma-TCGA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://wiki.cancerimagingarchive.net/display/Public/PROSTATE-DIAGNOSIS" target="_blank" rel="noopener"
&gt;Prostate-Diagnosis - PCa T1- and T2-weighted magnetic resonance
images (MRIs) were acquired
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-Diagnosis.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://wiki.cancerimagingarchive.net/display/Public/Prostate&amp;#43;Fused-MRI-Pathology" target="_blank" rel="noopener"
&gt;Prostate-Fused-MRI-Pathology - The Prostate Fused-MRI-Pathology
collection is a combination
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-Fused-MRI-Pathology.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://wiki.cancerimagingarchive.net/display/Public/Prostate-MRI" target="_blank" rel="noopener"
&gt;Prostate-MRI - The Prostate-MRI collection of prostate Magnetic
Resonance Images (MRIs) was
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-MRI.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://web.stanford.edu/~hastie/ElemStatLearn/datasets/prostate.data" target="_blank" rel="noopener"
&gt;Prostate-R - The R package 'ElemStatLearn' contains a prostate
cancer dataset from Stamey et
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/Prostate-R.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://wiki.cancerimagingarchive.net/display/Public/QIN-PROSTATE-Repeatability" target="_blank" rel="noopener"
&gt;QIN-PROSTATE-Repeatability - The QIN-PROSTATE-Repeatability dataset
is a dataset with
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/QIN-PROSTATE-Repeatability.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://wiki.cancerimagingarchive.net/display/Public/QIN&amp;#43;PROSTATE" target="_blank" rel="noopener"
&gt;QIN-PROSTATE - The QIN PROSTATE collection of the Quantitative
Imaging Network (QIN) contains
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/QIN-PROSTATE.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://seer.cancer.gov/data/seerstat/nov2017/" target="_blank" rel="noopener"
&gt;SEER-YR1973_2015.SEER9 - The SEER November 2017 Research Data files
from nine SEER registries
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/SEER-YR1973_2015.SEER9.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://seer.cancer.gov/data/seerstat/nov2017/" target="_blank" rel="noopener"
&gt;SEER-YR1992_2015.SJ_LA_RG_AK - The SEER November 2017 Research Data
files from the San Jose-
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/SEER-YR1992_2015.SJ_LA_RG_AK.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://seer.cancer.gov/data/seerstat/nov2017/" target="_blank" rel="noopener"
&gt;SEER-YR2000_2015.CA_KY_LO_NJ_GA - The SEER November 2017 Research
Data files from the Greater
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/SEER-YR2000_2015.CA_KY_LO_NJ_GA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://seer.cancer.gov/data/seerstat/nov2017/" target="_blank" rel="noopener"
&gt;SEER-YR2005.LO_2ND_HALF - The July - December 2005
diagnoses for Louisiana from their
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/SEER-YR2005.LO_2ND_HALF.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cbioportal.org/study?id=prad_tcga" target="_blank" rel="noopener"
&gt;TCGA-PRAD-US - TCGA Prostate Adenocarcinoma (499
samples).&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//ProstateCancer/TCGA-PRAD-US.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="psychologycognition"&gt;Psychology+Cognition
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.cmr.osu.edu/browse/datasets" target="_blank" rel="noopener"
&gt;OSU Cognitive Modeling Repository
Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Psychology+Cognition/OSU-Cognitive-Modeling-Repository-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://nimh-dsst.github.io/OpenCogData/" target="_blank" rel="noopener"
&gt;Open Cognitive Science Data - Publicly available behavioral
datasets from across cognitive
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Psychology+Cognition/Open-Cognitive-Science-Data-Repository.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="publicdomains"&gt;PublicDomains
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.ably.io/hub/" target="_blank" rel="noopener"
&gt;Ably Open Realtime Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Ably.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://aws.amazon.com/datasets/" target="_blank" rel="noopener"
&gt;Amazon&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Amazon.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://archive.org/details/datasets" target="_blank" rel="noopener"
&gt;Archive.org Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Archive.org-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.archive-it.org/explore?show=Collections" target="_blank" rel="noopener"
&gt;Archive-it from Internet
Archive&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Archive.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://lib.stat.cmu.edu/jasadata/" target="_blank" rel="noopener"
&gt;CMU JASA data archive&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/CMU-JASA-data-archive.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://lib.stat.cmu.edu/datasets/" target="_blank" rel="noopener"
&gt;CMU StatLab collections&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/CMU-StatLab-collections.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.world" target="_blank" rel="noopener"
&gt;Data.World&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Data.World.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.data360.org/index.aspx" target="_blank" rel="noopener"
&gt;Data360&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Data360.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://public.enigma.com/" target="_blank" rel="noopener"
&gt;Enigma Public&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Enigma-Public.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.google.com/publicdata/directory" target="_blank" rel="noopener"
&gt;Google&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Google.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.comics.org" target="_blank" rel="noopener"
&gt;Grand Comics Database - The Grand Comics Database (GCD) is a
nonprofit, internet-based \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/GrandComics.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.infochimps.com/" target="_blank" rel="noopener"
&gt;Infochimps&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Infochimps.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.kdnuggets.com/datasets/index.html" target="_blank" rel="noopener"
&gt;KDNuggets Data
Collections&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/KDNuggets-Data-Collections.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://azuremarketplace.microsoft.com/en-us/marketplace/apps?source=datamarket&amp;amp;filters=pricing-free&amp;amp;page=1" target="_blank" rel="noopener"
&gt;Microsoft Azure Data Market Free
DataSets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Microsoft-Azure-Data-Market-Free-DataSets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://aka.ms/Data-Science" target="_blank" rel="noopener"
&gt;Microsoft Data Science for Research&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Microsoft-Data-Science-for-Research.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://msropendata.com/" target="_blank" rel="noopener"
&gt;Microsoft Research Open Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Microsoft-Research-Open-Data)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://openlibrary.org/developers/dumps" target="_blank" rel="noopener"
&gt;Open Library Data Dumps&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Open-Library-Data-Dumps.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.reddit.com/r/datasets" target="_blank" rel="noopener"
&gt;Reddit Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Reddit-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://packages.revolutionanalytics.com/datasets/" target="_blank" rel="noopener"
&gt;RevolutionAnalytics
Collection&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/RevolutionAnalytics-Collection.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://stat.ethz.ch/R-manual/R-patched/library/datasets/html/00Index.html" target="_blank" rel="noopener"
&gt;Sample R data
sets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Sample-R-data-sets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://insights.stackoverflow.com/survey" target="_blank" rel="noopener"
&gt;Stack Overflow Annual Developer Survey - Annual developer survey
full data sets from 2011
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Stack-Overflow-Annual-Developer-Survey.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.statsci.org/datasets.html" target="_blank" rel="noopener"
&gt;StatSci.org&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/StatSci.org.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://web.archive.org/web/20151024082129/http://www.stats4stem.org:80/data-sets.html" target="_blank" rel="noopener"
&gt;Stats4Stem R data sets
(archived)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Stats4Stem-R-data-sets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.washingtonpost.com/wp-srv/metro/data/datapost.html" target="_blank" rel="noopener"
&gt;The Washington Post
List&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/The-Washington-Post-List.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://wiki.stat.ucla.edu/socr/index.php/SOCR_Data" target="_blank" rel="noopener"
&gt;UCLA SOCR data
collection&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/UCLA-SOCR-data-collection.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.nuforc.org/webreports.html" target="_blank" rel="noopener"
&gt;UFO Reports&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/UFO-Reports.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://911.wikileaks.org/files/index.html" target="_blank" rel="noopener"
&gt;Wikileaks 911 pager
intercepts&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Wikileaks-911-pager-intercepts.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://webscope.sandbox.yahoo.com/catalog.php" target="_blank" rel="noopener"
&gt;Yahoo Webscope&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//PublicDomains/Yahoo-Webscope.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="searchengines"&gt;SearchEngines
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://academictorrents.com/" target="_blank" rel="noopener"
&gt;Academic Torrents of data sharing from
UMB&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SearchEngines/Academic-Torrents-of-data-sharing-from-UMB.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://basedosdados.org/en" target="_blank" rel="noopener"
&gt;Base dos Dados - Data Basis: Open Data Repository for
Brazil&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SearchEngines/BaseDosDados.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://datahub.io/dataset" target="_blank" rel="noopener"
&gt;Datahub.io&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SearchEngines/Datahub.io.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/tb0hdan/domains" target="_blank" rel="noopener"
&gt;Domains Project - Sorted list of Internet
domains&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SearchEngines/DomainsProject.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://dataverse.harvard.edu/" target="_blank" rel="noopener"
&gt;Harvard Dataverse Network of scientific
data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SearchEngines/Harvard-Dataverse-Network-of-scientific-data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.icpsr.umich.edu/web/pages/ICPSR/index.html" target="_blank" rel="noopener"
&gt;ICPSR
(UMICH)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SearchEngines/ICPSR-UMICH.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://eric.ed.gov" target="_blank" rel="noopener"
&gt;Institute of Education Sciences&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SearchEngines/Institute-of-Education-Sciences.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://ntrl.ntis.gov/NTRL/" target="_blank" rel="noopener"
&gt;National Technical Reports Library&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SearchEngines/National-Technical-Reports-Library.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://certificates.theodi.org/en/datasets" target="_blank" rel="noopener"
&gt;Open Data Certificates
(beta)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SearchEngines/Open-Data-Certificates-beta.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.opendatanetwork.com/" target="_blank" rel="noopener"
&gt;OpenDataNetwork - A search engine of all Socrata powered data
portals&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SearchEngines/OpenDataNetwork.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.statista.com/" target="_blank" rel="noopener"
&gt;Statista.com - Statistics and Studies&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SearchEngines/Statista.com.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://zenodo.org/collection/datasets" target="_blank" rel="noopener"
&gt;Zenodo - An open dependable home for the long-tail of
science&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SearchEngines/Zenodo.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="socialnetworks"&gt;SocialNetworks
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/msramalho/election-watch/blob/master/datasets/01_portuguese_presidential_elections_2021_01_24.md" target="_blank" rel="noopener"
&gt;2021 Portuguese Elections Twitter Dataset - 57M+ tweets, 1M+
users - This dataset contains
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/2021_Portuguese_Elections_Twitter_Dataset_57M_tweets_1M_users.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://waxy.org/random/misc/gamergate_tweets.csv" target="_blank" rel="noopener"
&gt;72 hours #gamergate Twitter
Scrape&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/72-hours-gamergate-Twitter-Scrape.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cs.cmu.edu/~enron/" target="_blank" rel="noopener"
&gt;CMU Enron Email of 150 users&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/CMU-Enron-Email-of-150-users.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://archive.org/details/twitter_cikm_2010" target="_blank" rel="noopener"
&gt;Cheng-Caverlee-Lee September 2009 - January 2010 Twitter
Scrape&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Cheng-Caverlee-Lee-Twitter-Scrape-September-2009~January-2010.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://projects.iq.harvard.edu/cbdb" target="_blank" rel="noopener"
&gt;China Biographical Database - The China Biographical Database is a
freely accessible \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/China-Biographical-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/johntukey/clubhouse-dataset" target="_blank" rel="noopener"
&gt;Clubhouse
Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Clubhouse-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://zenodo.org/record/3723940" target="_blank" rel="noopener"
&gt;A Twitter Dataset of 40+ million tweets related to COVID-19 - Due
to the relevance of the \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Covid19-40-Million-Tweets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://pikaso.me/blog/trump-twitter-archive" target="_blank" rel="noopener"
&gt;43k+ Donald Trump Twitter Screenshots - This archive contains
screenshots of 43,475 Donald
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Donald-Trump-Twitter-Screenshots.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://aws.amazon.com/datasets/enron-email-data/" target="_blank" rel="noopener"
&gt;EDRM Enron EMail of 151 users, hosted on
S3&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/EDRM-Enron-EMail-of-151-users-hosted-on-S3.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://archive.org/details/oxford-2005-facebook-matrix" target="_blank" rel="noopener"
&gt;Facebook Data Scrape
(2005)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Facebook-Data-Scrape-2005.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.humdata.org/dataset/social-connectedness-index" target="_blank" rel="noopener"
&gt;Facebook Social Connectedness Index - We use an anonymized snapshot
of all active Facebook
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Facebook-Social-Connectedness-Index.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://law.di.unimi.it/datasets.php" target="_blank" rel="noopener"
&gt;Facebook Social Networks from LAW (since
2007)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Facebook-Social-Networks-from-LAW-since-2007.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://archive.org/details/201309_foursquare_dataset_umn" target="_blank" rel="noopener"
&gt;Foursquare from UMN/Sarwat
(2013)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Foursquare-from-UMN-Sarwat-2013.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.gharchive.org/" target="_blank" rel="noopener"
&gt;GitHub Collaboration Archive&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/GitHub-Collaboration-Archive.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://web.archive.org/web/20190522043016/http://www3.cs.stonybrook.edu/~leman/data/gscholar.db" target="_blank" rel="noopener"
&gt;Google Scholar citation
relations&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Google-Scholar-citation-relations.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.sociopatterns.org/datasets/" target="_blank" rel="noopener"
&gt;High-Resolution Contact Networks from Wearable
Sensors&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/High-Resolution-Contact-Networks-from-Wearable-Sensors.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.indiemap.org/" target="_blank" rel="noopener"
&gt;Indie Map: social graph and crawl of top IndieWeb
sites&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Indie-Map.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://kdl.cs.umass.edu/display/public/Mobile&amp;#43;Social&amp;#43;Networks" target="_blank" rel="noopener"
&gt;Mobile Social Networks from
UMASS&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Mobile-Social-Networks-from-UMASS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://snap.stanford.edu/data/higgs-twitter.html" target="_blank" rel="noopener"
&gt;Network Twitter
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Network-Twitter-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://files.pushshift.io/reddit/comments/" target="_blank" rel="noopener"
&gt;Reddit Comments&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Reddit-Comments.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/quankiquanki/skytrax-reviews-dataset" target="_blank" rel="noopener"
&gt;Skytrax's Air Travel Reviews
Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Skytrax-Air-Travel-Reviews-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://snap.stanford.edu/data/egonets-Twitter.html" target="_blank" rel="noopener"
&gt;Social Twitter
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Social-Twitter-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www3.nd.edu/~oss/Data/data.html" target="_blank" rel="noopener"
&gt;SourceForge.net Research
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/SourceForge.net-Research-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://socialgrep.com/datasets/the-reddit-covid-dataset" target="_blank" rel="noopener"
&gt;The Reddit COVID dataset - This dataset attempts to capture the
full extent of COVID-19
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/The-Reddit-COVID-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/aayushmishra1512/twitchdata" target="_blank" rel="noopener"
&gt;Twitch Top Streamers
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/TwitchTopStreamers.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://nlp.uned.es/replab2013/" target="_blank" rel="noopener"
&gt;Twitter Data for Online Reputation
Management&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Twitter-Data-for-Online-Reputation-Management.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://help.sentiment140.com/for-students/" target="_blank" rel="noopener"
&gt;Twitter Data for Sentiment
Analysis&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Twitter-Data-for-Sentiment-Analysis.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://an.kaist.ac.kr/traces/WWW2010.html" target="_blank" rel="noopener"
&gt;Twitter Graph of entire Twitter
site&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Twitter-Graph-of-entire-Twitter-site.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://archive.org/details/2011-05-calufa-twitter-sql" target="_blank" rel="noopener"
&gt;Twitter Scrape Calufa May
2011&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Twitter-Scrape-Calufa-May-2011.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://law.di.unimi.it/datasets.php" target="_blank" rel="noopener"
&gt;UNIMI/LAW Social Network
Datasets&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/UNIMILAW-Social-Network-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/alexlitel/congresstweets" target="_blank" rel="noopener"
&gt;United States Congress Twitter Data - Daily datasets with tweets of
1100+ accounts associated
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/United-States-Congressional-Twitter-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://webscope.sandbox.yahoo.com/catalog.php?datatype=g" target="_blank" rel="noopener"
&gt;Yahoo! Graph and Social
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Yahoo-Graph-and-Social-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://netsg.cs.sfu.ca/youtubedata/" target="_blank" rel="noopener"
&gt;Youtube Video Social Graph in
2007 and 2008&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialNetworks/Youtube-Video-Social-Graph-in-2007~2008.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="socialsciences"&gt;SocialSciences
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.acleddata.com/" target="_blank" rel="noopener"
&gt;ACLED (Armed Conflict Location &amp;amp; Event Data
Project)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/ACLED.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/QZ9BSA" target="_blank" rel="noopener"
&gt;Authoritarian Ruling Elites Database - The Authoritarian Ruling
Elites Database (ARED) is a
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Authoritarian-Ruling-Elites.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.canlii.org/en/index.php" target="_blank" rel="noopener"
&gt;Canadian Legal Information
Institute&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Canadian-Legal-Information-Institute.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.systemicpeace.org/" target="_blank" rel="noopener"
&gt;Center for Systemic Peace Datasets - Conflict Trends, Polities,
State Fragility, etc&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Center-for-Systemic-Peace-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.correlatesofwar.org/" target="_blank" rel="noopener"
&gt;Correlates of War Project&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Correlates-of-War-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://cryptome.org" target="_blank" rel="noopener"
&gt;Cryptome Conspiracy Theory Items&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Cryptome-Conspiracy-Theory-Items.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.datacards.org/login/" target="_blank" rel="noopener"
&gt;Datacards&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Datacards.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.europeansocialsurvey.org/data/" target="_blank" rel="noopener"
&gt;European Social Survey&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/European-Social-Survey.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/emorisse/FBI-Hate-Crime-Statistics/tree/master/2013" target="_blank" rel="noopener"
&gt;FBI Hate Crime 2013 - aggregated
data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/FBI-Hate-Crime-2013.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://fragilestatesindex.org/" target="_blank" rel="noopener"
&gt;Fragile States Index&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Fragile-States-Index.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://gdeltproject.org/data.html" target="_blank" rel="noopener"
&gt;GDELT Global Events Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/GDELT-Global-Events-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://gss.norc.org" target="_blank" rel="noopener"
&gt;General Social Survey (GSS) since 1972&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/General-Social-Survey-GSS-since-1972.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.gesis.org/en/home/" target="_blank" rel="noopener"
&gt;German Social Survey&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/German-Social-Survey.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.globalreligiousfutures.org/" target="_blank" rel="noopener"
&gt;Global Religious Futures
Project&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Global-Religious-Futures-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/jamesqo/gun-violence-data" target="_blank" rel="noopener"
&gt;Gun Violence Data - A comprehensive, accessible database that
contains records of over 260k
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Gun-Violence-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.humdata.org/" target="_blank" rel="noopener"
&gt;Humanitarian Data Exchange&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Humanitarian-Data-Exchange.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.inform-index.org/Results/Global" target="_blank" rel="noopener"
&gt;INFORM Index for Risk
Management&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/INFORM-Index-for-Risk-Management.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.ined.fr/en/" target="_blank" rel="noopener"
&gt;Institute for Demographic Studies&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Institute-for-Demographic-Studies.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.iadb.org/" target="_blank" rel="noopener"
&gt;Inter-American Development Bank Open Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Inter-American-Development-Bank-Open-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.princeton.edu/~ina/" target="_blank" rel="noopener"
&gt;International Networks Archive&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/International-Networks-Archive.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.issp.org" target="_blank" rel="noopener"
&gt;International Social Survey Program ISSP&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/International-Social-Survey-Program-ISSP.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.isacompendium.com/public/" target="_blank" rel="noopener"
&gt;International Studies Compendium
Project&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/International-Studies-Compendium-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://jmcguire.faculty.wesleyan.edu/welcome/cross-national-data/" target="_blank" rel="noopener"
&gt;James McGuire Cross National
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/James-McGuire-Cross-National-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://realitycommons.media.mit.edu/realitymining.html" target="_blank" rel="noopener"
&gt;MIT Reality Mining
Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/MIT-Reality-Mining-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://nsd.uib.no" target="_blank" rel="noopener"
&gt;MacroData Guide by Norsk samfunnsvitenskapelig
datatjeneste&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/MacroData-Guide-by-Norsk-samfunnsvitenskapelig-datatjeneste.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://dataverse.harvard.edu/dataverse/MMdata" target="_blank" rel="noopener"
&gt;Mass Mobilization Data Project - The Mass Mobilization (MM) data
are an effort to understand
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Mass-Mobilization-Data-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://ma-graph.org" target="_blank" rel="noopener"
&gt;Microsoft Academic Knowledge Graph - The Microsoft Academic
Knowledge Graph is a large RDF \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Microsoft-Academic-Knowledge-Graph.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.ipums.org/" target="_blank" rel="noopener"
&gt;Minnesota Population Center&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Minnesota-Population-Center.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://gain.nd.edu/our-work/country-index/download-data/" target="_blank" rel="noopener"
&gt;Notre Dame Global Adaptation Index
(ND-GAIN)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Notre-Dame-Global-Adaptation-Index-NG-DAIN.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.police.uk/data/" target="_blank" rel="noopener"
&gt;Open Crime and Policing Data in England, Wales and Northern
Ireland&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Open-Crime-and-Policing-Data-in-England-Wales-and-Northern-Ireland.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.opensanctions.org/#downloads" target="_blank" rel="noopener"
&gt;OpenSanctions - A global database of persons and companies of
political, criminal, or
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/OpenSanctions.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.paulhensel.org/dataintl.html" target="_blank" rel="noopener"
&gt;Paul Hensel General International Data
Page&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Paul-Hensel-General-International-Data-Page.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.pewinternet.org/?post_type=dataset" target="_blank" rel="noopener"
&gt;PewResearch Internet Survey
Project&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/PewResearch-Internet-Survey-Project.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.pewresearch.org/data/download-datasets/" target="_blank" rel="noopener"
&gt;PewResearch Society Data
Collection&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/PewResearch-Society-Data-Collection.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www3.cs.stonybrook.edu/~leman/data/14-icwsm-political-polarity-data.zip" target="_blank" rel="noopener"
&gt;Political Polarity
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Political-Polarity-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://data.stackexchange.com/help" target="_blank" rel="noopener"
&gt;StackExchange Data Explorer&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/StackExchange-Data-Explorer.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.trackingterrorism.org/" target="_blank" rel="noopener"
&gt;Terrorism Research and Analysis
Consortium&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Terrorism-Research-and-Analysis-Consortium.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.tdcj.state.tx.us/death_row/dr_executed_offenders.html" target="_blank" rel="noopener"
&gt;Texas Inmates Executed Since
1984&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Texas-Inmates-Executed-Since-1984.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/c/titanic/data" target="_blank" rel="noopener"
&gt;Titanic Survival Data Set&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Titanic-Survival-Data-Set.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://ucdata.berkeley.edu/" target="_blank" rel="noopener"
&gt;UCB's Archive of Social Science Data
(D-Lab)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/UCBs-Archive-of-Social-Science-Data-D-Lab.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://dataverse.harvard.edu/dataverse/ssda_ucla" target="_blank" rel="noopener"
&gt;UCLA Social Sciences Data
Archive&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/UCLA-Social-Sciences-Data-Archive.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://esango.un.org/civilsociety/" target="_blank" rel="noopener"
&gt;UN Civil Society Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/UN-Civil-Society-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.upjohn.org/services/resources/employment-research-data-center" target="_blank" rel="noopener"
&gt;UPJOHN for Labor Employment
Research&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/UPJOHN-for-Labor-Employment-Research.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://univ.cc/" target="_blank" rel="noopener"
&gt;Universities Worldwide&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Universities-Worldwide.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://ucdp.uu.se/" target="_blank" rel="noopener"
&gt;Uppsala Conflict Data Program&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/Uppsala-Conflict-Data-Program.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://data.worldbank.org/" target="_blank" rel="noopener"
&gt;World Bank Open Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/World-Bank-Open-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://wid.world" target="_blank" rel="noopener"
&gt;World Inequality Database - The World Inequality Database
(WID.world) aims to provide open \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/World-Inequality-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.worldpop.org.uk/data/get_data/" target="_blank" rel="noopener"
&gt;WorldPop project - Worldwide human population
distributions&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//SocialSciences/WorldPop-project.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="software"&gt;Software
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://flossdata.syr.edu/data/" target="_blank" rel="noopener"
&gt;FLOSSmole data about free, libre, and open source software
development&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Software/FLOSSmole-data-about-free-libre-and-open-source-software-development.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://ghtorrent.org" target="_blank" rel="noopener"
&gt;GHTorrent - Scalable, queryable, offline mirror of data offered
through the GitHub REST API.&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Software/GHTorrent.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://doi.org/10.5281/zenodo.1068916" target="_blank" rel="noopener"
&gt;Libraries.io Open Source Repository and Dependency
Metadata&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Software/Libraries.io-Open-Source-Repository-and-Dependency-Metadata.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/src-d/datasets/tree/master/PublicGitArchive" target="_blank" rel="noopener"
&gt;Public Git Archive - a Big Code dataset for all &amp;ndash; dataset of
182,014 top-bookmarked Git
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Software/source%7Bd%7D-Public-Git-Archive.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/src-d/datasets/tree/master/Duplicates" target="_blank" rel="noopener"
&gt;Code duplicates - 2k Java file and 600 Java function pairs labeled
as similar or different by
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Software/source%7Bd%7D-code-duplicates.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/src-d/datasets/blob/master/CommitMessages" target="_blank" rel="noopener"
&gt;Commit messages - 1.3 billion GitHub commit messages till March
2019&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Software/source%7Bd%7D-commit-messages.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/src-d/datasets/blob/master/ReviewComments" target="_blank" rel="noopener"
&gt;Pull Request review comments - 25.3 million GitHub PR review
comments since January 2015 till
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Software/source%7Bd%7D-pull-request-review-comments.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/src-d/datasets/tree/master/Identifiers" target="_blank" rel="noopener"
&gt;Source Code Identifiers - 41.7 million distinct splittable
identifiers collected from 182,014
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Software/source%7Bd%7D-source-code-identifiers.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="sports"&gt;Sports
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.world/ninja/anw-obstacle-history" target="_blank" rel="noopener"
&gt;American Ninja Warrior Obstacles - Contains every obstacle in the
history of American Ninja
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/American-Ninja-Warrior-Obstacles.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://data.betfair.com/" target="_blank" rel="noopener"
&gt;Betfair Historical Exchange Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/Betfair-Historical-Exchange-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://cricsheet.org/" target="_blank" rel="noopener"
&gt;Cricsheet Matches (cricket)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/Cricsheet-Matches-cricket.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://ope.ed.gov/athletics" target="_blank" rel="noopener"
&gt;Equity in Athletics - The Equity in Athletics Data Analysis Cutting
Tool is brought to you by \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/Equity-in-Athletics.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://ergast.com/mrd/db" target="_blank" rel="noopener"
&gt;Ergast Formula 1, from 1950 up to date
(API)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/Ergast-Formula-1-from-1950-up-to-date-API.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.jokecamp.com/blog/guide-to-football-and-soccer-data-and-apis/" target="_blank" rel="noopener"
&gt;Football/Soccer resources (data and
APIs)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/FootballSoccer-resources-data-and-APIs.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://sabr.org/lahman-database/" target="_blank" rel="noopener"
&gt;Lahman's Baseball Database&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/Lahmans-Baseball-Database.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.dolthub.com/repositories/Liquidata/nfl-play-by-play" target="_blank" rel="noopener"
&gt;NFL play-by-play data - NFL play-by-play data sourced from:
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/NFL-play-by-play.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/phillc73/pinhooker" target="_blank" rel="noopener"
&gt;Pinhooker: Thoroughbred Bloodstock Sale
Data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/Pinhooker.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/ranganadhkodali/Pro-Kabadi-season-1-7-Stats" target="_blank" rel="noopener"
&gt;Pro Kabadi season 1 to 7 - Pro Kabaddi League is a
professional-level kabaddi league in India.
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/Pro_Kabadi_season1_7.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.retrosheet.org/game.htm" target="_blank" rel="noopener"
&gt;Retrosheet Baseball Statistics&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/Retrosheet-Baseball-Statistics.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/JeffSackmann/tennis_atp" target="_blank" rel="noopener"
&gt;Tennis database of rankings, results, and stats for
ATP&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/Tennis-database-of-rankings-results-and-stats-for-ATP.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/JeffSackmann/tennis_wta" target="_blank" rel="noopener"
&gt;Tennis database of rankings, results, and stats for
WTA&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/Tennis-database-of-rankings-results-and-stats-for-WTA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/dcaribou/transfermarkt-datasets" target="_blank" rel="noopener"
&gt;Transfermarkt Datasets - Clean, structured and automatically
updated football (soccer) data
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/Transfermarkt-Datasets.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/gavinr/usa-soccer" target="_blank" rel="noopener"
&gt;USA Soccer Teams and Locations - USA soccer teams and locations.
MLS, NWSL, and USL \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Sports/USA-Soccer.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="timeseries"&gt;TimeSeries
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/ricardovvargas/3w_dataset" target="_blank" rel="noopener"
&gt;3W dataset - To the best of its authors' knowledge, this is the
first realistic and public
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//TimeSeries/3W-dataset-rare-undesirable-real-events-in-oil-wells.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.cntsdata.com" target="_blank" rel="noopener"
&gt;Databanks International Cross National Time Series Data
Archive&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//TimeSeries/Databanks-International-Cross-National-Time-Series-Data-Archive.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.backblaze.com/hard-drive-test-data.html" target="_blank" rel="noopener"
&gt;Hard Drive Failure
Rates&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//TimeSeries/Hard-Drive-Failure-Rates.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://ecg.mit.edu/time-series/" target="_blank" rel="noopener"
&gt;Heart Rate Time Series from MIT&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//TimeSeries/Heart-Rate-Time-Series-from-MIT.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://pkg.yangzhuoranyang.com/tsdl/" target="_blank" rel="noopener"
&gt;Time Series Data Library (TSDL) from
Monash University&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//TimeSeries/Time-Series-Data-Library-TSDL-from-MU.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/alan-turing-institute/TCPD" target="_blank" rel="noopener"
&gt;Turing Change Point Dataset - Contains 42 annotated time series
collected for the development
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//TimeSeries/Turing-Change-Point-Dataset.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.cs.ucr.edu/~eamonn/time_series_data_2018/" target="_blank" rel="noopener"
&gt;UC Riverside Time Series
Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//TimeSeries/UC-Riverside-Time-Series-Dataset.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="transportation"&gt;Transportation
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://doi.org/10.7910/DVN/HG7NV7" target="_blank" rel="noopener"
&gt;Airlines OD Data 1987-2008&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Airlines-OD-Data-1987~2008.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.fordgobike.com/system-data" target="_blank" rel="noopener"
&gt;Ford GoBike Data (formerly Bay Area Bike Share
Data)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Bay-Area-Bike-Share-Data.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/BetaNYC/Bike-Share-Data-Best-Practices/wiki/Bike-Share-Data-Systems" target="_blank" rel="noopener"
&gt;Bike Share Systems (BSS)
collection&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Bike-Share-Systems-BSS-collection.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://opendata.ndw.nu/" target="_blank" rel="noopener"
&gt;Dutch Traffic Information&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Dutch-Traffic-Information.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.microsoft.com/en-us/download/details.aspx?id=52367" target="_blank" rel="noopener"
&gt;GeoLife GPS Trajectory from Microsoft
Research&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/GeoLife-GPS-Trajectory-from-Microsoft-Research.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://data.deutschebahn.com/dataset.groups.datasets.html" target="_blank" rel="noopener"
&gt;German train system by Deutsche
Bahn&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/German-train-system-by-Deutsche-Bahn.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://hubwaydatachallenge.org/trip-history-data/" target="_blank" rel="noopener"
&gt;Hubway Million Rides in
MA&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Hubway-Million-Rides-in-MA.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.melbourne.vic.gov.au/explore/dataset/pedestrian-counting-system-monthly-counts-per-hour/" target="_blank" rel="noopener"
&gt;Melbourne Pedestrian Counting - This dataset contains hourly
pedestrian counts since 2009
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Melbourne-pedestrian-counting.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://montreal.bixi.com/en/open-data" target="_blank" rel="noopener"
&gt;Montreal BIXI Bike Share&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Montreal-BIXI-Bike-Share.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www1.nyc.gov/site/tlc/about/tlc-trip-record-data.page" target="_blank" rel="noopener"
&gt;NYC Taxi Trip Data, 2009 to
present&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/NYC-Taxi-Trip-Data-2009.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://archive.org/details/nycTaxiTripData2013" target="_blank" rel="noopener"
&gt;NYC Taxi Trip Data 2013
(FOIA/FOILed)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/NYC-Taxi-Trip-Data-2013-FOIA-FOILed.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/fivethirtyeight/uber-tlc-foil-response" target="_blank" rel="noopener"
&gt;NYC Uber trip data April 2014 to September
2014&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/NYC-Uber-trip-data-April-2014-to-September-2014.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://github.com/graphhopper/open-traffic-collection" target="_blank" rel="noopener"
&gt;Open Traffic
collection&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Open-Traffic-collection.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://openflights.org/data.php" target="_blank" rel="noopener"
&gt;OpenFlights - airport, airline and route
data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/OpenFlights.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.rideindego.com/stations/json/" target="_blank" rel="noopener"
&gt;Philadelphia Bike Share Stations
(JSON)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Philadelphia-Bike-Share-Stations-JSON.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.planecrashinfo.com/database.htm" target="_blank" rel="noopener"
&gt;Plane Crash Database, since
1920&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Plane-Crash-Database-since-1920.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.transtats.bts.gov/Tables.asp?DB_ID=120" target="_blank" rel="noopener"
&gt;RITA Airline On-Time Performance
data&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/RITA-Airline-On.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://www.transtats.bts.gov/DataIndex.asp" target="_blank" rel="noopener"
&gt;RITA/BTS transport data collection
(TranStat)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/RITA-BTS-transport-data-collection-TranStat.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://data.renfe.com" target="_blank" rel="noopener"
&gt;Renfe (Spanish National Railway Network)
dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Spanish-train-system-by-Renfe.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.toronto.ca/city-government/data-research-maps/open-data/open-data-catalogue/#84045f23-7465-0892-8889-7b6f91049b29" target="_blank" rel="noopener"
&gt;Toronto Bike Share Stations (JSON and GBFS
files)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Toronto-Bike-Share-Stations-XML-file.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://tfl.gov.uk/info-for/open-data-users/our-open-data" target="_blank" rel="noopener"
&gt;Transport for London
(TFL)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Transport-for-London-TFL.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://www.cmap.illinois.gov/data/transportation/travel-tracker-survey" target="_blank" rel="noopener"
&gt;Travel Tracker Survey (TTS) for
Chicago&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/Travel-Tracker-Survey-TTS-for-Chicago.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://www.bts.gov/browse-statistical-products-and-data" target="_blank" rel="noopener"
&gt;U.S. Bureau of Transportation Statistics
(BTS)&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/U.S.-Bureau-of-Transportation-Statistics-BTS.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="http://academictorrents.com/details/a2ccf94bbb4af222bf8e69dad60a68a29f310d9a" target="_blank" rel="noopener"
&gt;U.S. Domestic Flights 1990 to
2009&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/U.S.-Domestic-Flights-1990-to-2009.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="http://ops.fhwa.dot.gov/freight/freight_analysis/faf/index.htm" target="_blank" rel="noopener"
&gt;U.S. Freight Analysis Framework since
2007&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/U.S.-Freight-Analysis-Framework-since-2007.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/fixme-24.png"
loading="lazy"
alt="FIXME_ICON"
&gt;
&lt;a class="link" href="https://nhtsa.gov/FARS/" target="_blank" rel="noopener"
&gt;U.S. National Highway Traffic Safety Administration - Fatalities
since 1975 - Contains CSV \[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//Transportation/U.S.-National-Highway-Traffic-Safety-Administation-Fatalities-since-1975.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="esports"&gt;eSports
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/skihikingkevin/csgo-matchmaking-damage" target="_blank" rel="noopener"
&gt;CS:GO Competitive Matchmaking Data - This data set contains data
about CS:GO matchmaking
\[\...\]&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//eSports/csgo.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://www.kaggle.com/aayushmishra1512/fifa-2021-complete-player-data" target="_blank" rel="noopener"
&gt;FIFA-2021 Complete Player
Dataset&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//eSports/fifa2021.yml)\]&lt;/li&gt;
&lt;li&gt;&lt;img src="https://raw.githubusercontent.com/awesomedata/apd-core/master/deploy/ok-24.png"
loading="lazy"
alt="OK_ICON"
&gt;
&lt;a class="link" href="https://blog.opendota.com/2017/03/24/datadump2/" target="_blank" rel="noopener"
&gt;OpenDota data
dump&lt;/a&gt;
\[[Meta](https://github.com/awesomedata/apd-core/tree/master/core//eSports/opendota-dump.yml)\]&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="complementary-collections"&gt;Complementary Collections
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/datasets/" target="_blank" rel="noopener"
&gt;Data Packaged Core Datasets&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;OpenDataMonitor: &lt;a class="link" href="https://opendatamonitor.eu/frontend/web/index.php?r=dashboard%2Findex" target="_blank" rel="noopener"
&gt;An overview of available open data resources in
Europe&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Quora: &lt;a class="link" href="https://www.quora.com/Where-can-I-find-large-datasets-open-to-the-public" target="_blank" rel="noopener"
&gt;Where can I find large datasets open to the
public?&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;RS.io: &lt;a class="link" href="https://rs.io/100-interesting-data-sets-for-statistics/" target="_blank" rel="noopener"
&gt;100+ Interesting Data Sets for
Statistics&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;CVonline: &lt;a class="link" href="https://homepages.inf.ed.ac.uk/rbf/CVonline/" target="_blank" rel="noopener"
&gt;Image
Databases&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;InnoTrek: &lt;a class="link" href="https://web.archive.org/web/20210427004644/http://caesar0301.github.io/posts/2014/10/23/leveraging-open-data-to-understand-urban-lives/" target="_blank" rel="noopener"
&gt;Leveraging open data to understand urban
lives&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;CV Papers: &lt;a class="link" href="https://web.archive.org/web/20180318042653/http://cvpapers.com/datasets.html" target="_blank" rel="noopener"
&gt;CV Datasets on the
web&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;</description></item></channel></rss>