Real tools. Real standards. Real impact. Bigpicture: from data to deployment.
Bigpicture is now in its fifth year, and the results are starting to speak for themselves. We’re reshaping the infrastructure for how digital pathology and AI tools are developed, validated and scaled. Through innovative software, legal frameworks, standardisation, and collaboration, Bigpicture is showing what’s possible when Europe’s brightest minds come together.
“We’ve put something into motion that people have talked about for years, but now it’s real. That’s something to be proud of,” says Jeroen van der Laak, Bigpicture Coordinator and Professor of Computational Pathology at Radboudumc.
Standardisation as a foundation for everything
One of the big breakthroughs in Bigpicture is the Data Sharing Agreement (DSA), a legal framework signed by 44 partners that makes it possible to share medical data at scale. This agreement enables ethical, GDPR-compliant access to pathology data across countries and institutions. “That alone is quite extraordinary. It means we’ve mapped out and solved a huge number of potential legal and ethical hurdles,” says Jeroen. “Without something like this, we’d still be stuck with small datasets and isolated one-on-one contracts.”
Building on the DSA, the project is now developing additional agreements for non-beneficiaries, extending Bigpicture’s impact even further into the research ecosystem.
Bigpicture has also helped shape practical standards for digital pathology, such as DICOM compatibility, the Metadata Standard for aligning the annotation of datasets and AI models, and concrete protocols for quality control. Jeroen: “People often talk about standards, but we’re delivering the tools too. We’re not just saying ‘follow this model’, we’re giving people the software to implement it.”
Shaping new policies and education
Bigpicture’s impact reaches far beyond tools and data. The project plays an active role in shaping the future of the field through education and contributions to international standards.
One key contribution is to the European Master in Molecular Pathology programme at the University of Nice, where Bigpicture partners introduce students to the fundamentals of AI in pathology. Students from across Europe take part in a six-week internship programme, gaining practical experience at partner institutions and building a European network of future-ready professionals. “So far, we’ve had students from all over Europe. It creates a network of young pathologists who stay connected and stay involved. That’s a really beautiful thing,” says Jeroen.
At the same time, Bigpicture is contributing to the development of a new ISO standard for digital pathology. This formal recognition of the project’s frameworks and workflows strengthens its long-term impact, ensuring that the methods and tools developed within Bigpicture become embedded in broader policy and practice.
A project that evolves with the field
Bigpicture’s infrastructure was never meant to be static. It’s designed to grow and respond to the needs of the field. A clear example is the work we’re doing to develop the first federated Bigpicture node: a major step towards decentralised data access. Jeroen: “This is a great innovation. It means that parties that want to share their data without putting it in a central repository can still make it available for research anywhere.” As data sovereignty and privacy requirements shift across Europe, Bigpicture’s architecture is adapting. From centralised hubs to the option of federated access, the project is evolving to meet new realities. Jeroen: “We’re always learning what works best. That’s the strength of this kind of collaboration.”
Changing the way we think
Perhaps the most profound innovation of Bigpicture isn’t a piece of code or a data pipeline, but a shift in culture. The project is helping to redefine what collaboration looks like in medical research. It's changing how institutions think about sharing, standards, and long-term impact. “Can we call it an innovation, preparing the landscape, changing the culture, making people think differently?” Jeroen reflects. “Whatever we call our pioneering role in the health industry, it certainly is impact.”
Pioneering change is never smooth. It brings resistance, friction, and moments of uncertainty. Jeroen: “The bumps in the road aren’t signs of failure, they’re what comes with being first.” But with every agreement signed, every image shared, and every technical tool delivered, we get a step closer to Bigpicture’s vision. And this deeper transformation is already in motion: a new way of working, one that makes room for scale, speed, and openness in AI research.
Preparing the field for impact at scale
Another major step forward is the infrastructure for objective benchmarking and validation. With indirect access and validation tools now in place, the project addresses a growing challenge in biomedical AI. Jeroen: “When a pathologist has three prostate cancer algorithms to choose from, how do they know which one to use? Good benchmarks are critical, and far from obvious.”
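To make that comparison concrete, here is a minimal sketch of what such a benchmark boils down to: scoring competing algorithms on the same held-out data with the same metric. The algorithm names and numbers below are invented for illustration; Bigpicture’s actual validation infrastructure is far more extensive.

```python
# Hypothetical benchmark: three candidate algorithms scored on the same
# held-out validation slides, ranked by AUROC. All data here is invented.
import numpy as np
from sklearn.metrics import roc_auc_score

labels = np.array([0, 1, 1, 0, 1, 0, 1, 1])  # ground-truth slide labels

predictions = {
    "algorithm_a": np.array([0.1, 0.8, 0.7, 0.3, 0.9, 0.2, 0.6, 0.75]),
    "algorithm_b": np.array([0.4, 0.6, 0.9, 0.1, 0.7, 0.3, 0.8, 0.65]),
    "algorithm_c": np.array([0.2, 0.9, 0.6, 0.2, 0.8, 0.4, 0.7, 0.85]),
}

# Scoring every candidate on identical data is what makes the
# comparison objective rather than anecdotal.
for name, scores in predictions.items():
    print(f"{name}: AUROC = {roc_auc_score(labels, scores):.3f}")
```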
And more is coming. As data uploads continue to grow, Bigpicture is getting closer to offering truly large-scale, high-quality datasets for training and validation. The vision ahead is bold, but tangible: a shared infrastructure that empowers researchers, protects data, and accelerates trustworthy innovation. And Bigpicture is delivering the building blocks, step by step.
Innovation in action: these tools make the difference
Bigpicture’s software innovations fall into several key categories, such as quality control, standardisation, and AI tooling. Jeroen: “These are not conceptual ideas or experimental pilots; they are real, concrete tools developed and available for real use. Their purpose is to make datasets easier to compile, curate, and upload; to make the data usable for research and algorithm training; and to make collaboration across institutions technically feasible.”
These are some of the tools developed in Bigpicture and/or adapted to the platform’s needs:
TANGO
This tool assesses image variability stemming from tissue staining and scanner settings. It helps labs gain insight into how changes in lab procedures affect image quality and downstream AI performance. TANGO looks at all the variables, from scanner settings and staining protocols to how often chemicals are refreshed. That all impacts the image and, ultimately, the AI model.
Learn more about TANGO
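This is not TANGO itself, but a conceptual sketch of the underlying idea: separating the haematoxylin and eosin stains and comparing their statistics across sites reveals the kind of variability TANGO quantifies. The file paths and lab names are hypothetical, and scikit-image’s built-in H&E deconvolution stands in for TANGO’s own analysis.

```python
# Conceptual sketch only: quantify stain variability between labs using
# scikit-image's H&E colour deconvolution. Paths and labs are hypothetical.
from skimage import io
from skimage.color import rgb2hed

def stain_stats(path):
    """Mean haematoxylin and eosin intensity for one image tile."""
    rgb = io.imread(path)[..., :3]      # drop alpha channel if present
    hed = rgb2hed(rgb)                  # separate H, E, and DAB channels
    return hed[..., 0].mean(), hed[..., 1].mean()

# Systematic differences between sites can degrade an AI model that
# was trained on images from only one of them.
for lab, tile in [("lab_A", "tile_a.png"), ("lab_B", "tile_b.png")]:
    h, e = stain_stats(tile)
    print(f"{lab}: mean H = {h:.3f}, mean E = {e:.3f}")
```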
Artifact Segmentation Model
A machine learning model that flags quality issues such as tissue folds, image artifacts, or out-of-focus regions before slides are used to train AI models. This ensures that poor input doesn’t compromise model performance. Jeroen: “The artefact segmentation model is the quality check that’s essential for reliable AI.”
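As a rough sketch of how such a check slots into a pipeline, the snippet below runs a segmentation network over a slide’s tiles and computes the fraction of pixels flagged as artifact. The model, its class layout, and the rejection threshold are all hypothetical; the actual Bigpicture model differs.

```python
# Minimal quality-control sketch with a hypothetical pretrained
# segmentation network (class 0 = clean tissue, >0 = artifact types).
import torch

def artifact_fraction(model, tiles):
    """Fraction of pixels flagged as artifact across a slide's tiles."""
    flagged, total = 0, 0
    with torch.no_grad():
        for tile in tiles:                  # tile: (3, H, W) float tensor
            logits = model(tile.unsqueeze(0))
            mask = logits.argmax(dim=1)     # per-pixel class prediction
            flagged += (mask > 0).sum().item()
            total += mask.numel()
    return flagged / max(total, 1)

# Exclude slides whose artifact burden would bias training, e.g.:
# if artifact_fraction(model, slide_tiles) > 0.10:
#     print("Slide rejected: too many folds / blurred regions")
```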
Standardisation tools and documents
Data Sharing Agreement (DSA)
A pioneering legal framework signed by 44 partners that enables GDPR-compliant data sharing across public and private institutions in Europe. It's the backbone of Bigpicture's large-scale collaboration efforts and sets a new standard for ethical data sharing.
Learn more about the DSA
The Data Transporter
A secure software tool that allows non-clinical data contributors to prepare, validate, and upload pathology datasets to the Bigpicture platform while maintaining full control over sensitive information. It ensures that data is extracted and standardised, that sensitive information is encoded, and that everything is encrypted before submission.
Learn more about the Data Transporter
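The sketch below illustrates two of those protection steps in miniature, replacing a direct identifier with a pseudonym and encrypting the record before it leaves the institution. It is a simplified stand-in: the Data Transporter’s real formats, key management, and upload protocol are more involved, and all identifiers here are invented.

```python
# Simplified illustration of pseudonymisation plus encryption before
# submission. Not the Data Transporter's actual implementation.
import hashlib
from cryptography.fernet import Fernet

SALT = b"project-specific-secret"        # in practice: securely managed

def pseudonymise(patient_id: str) -> str:
    """Replace a direct identifier with an irreversible pseudonym."""
    return hashlib.sha256(SALT + patient_id.encode()).hexdigest()[:16]

record = {"patient_id": pseudonymise("NL-12345"), "stain": "H&E"}

key = Fernet.generate_key()              # in practice: a managed key
ciphertext = Fernet(key).encrypt(str(record).encode())

# Only the encrypted payload ever leaves the contributing institution.
print(ciphertext[:32], b"...")
```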
Bigpicture’s Metadata Standard
A robust schema for describing pathology datasets in a consistent way. This enables reproducibility, dataset discovery, and interoperability. Jeroen: “We started with metadata for datasets, and now we’re expanding it to AI models. It’s all about making data exchangeable.”
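To show why a shared schema matters, here is a toy example of machine-checkable metadata. The fields and allowed values below are invented for illustration and are not the official Bigpicture Metadata Standard; the point is that a common schema lets any partner validate a dataset description automatically.

```python
# Illustrative (unofficial) metadata schema: a shared standard makes
# dataset descriptions machine-checkable before they enter the platform.
from jsonschema import validate

SCHEMA = {
    "type": "object",
    "required": ["dataset_id", "organ", "stain", "scanner"],
    "properties": {
        "dataset_id": {"type": "string"},
        "organ": {"type": "string"},
        "stain": {"enum": ["H&E", "IHC", "other"]},
        "scanner": {"type": "string"},
    },
}

metadata = {
    "dataset_id": "example-0001",
    "organ": "prostate",
    "stain": "H&E",
    "scanner": "vendor-x-model-y",
}

validate(instance=metadata, schema=SCHEMA)  # raises ValidationError on bad input
print("metadata conforms to the schema")
```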
SlideTap
This tool helps institutions get their local (meta)data into Bigpicture’s required format. It supports input from different local IT systems, Excel files, or manual entry and transforms this into clean, validated output. Jeroen: “People want to comply with standards, but often they can’t. SlideTap bridges that gap.”
Learn more about SlideTap
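The core of that bridging step looks roughly like the following: mapping a lab’s local spreadsheet columns onto standard field names and rejecting incomplete rows before upload. This is a toy stand-in, not SlideTap itself; the file and column names are hypothetical.

```python
# Toy illustration of the mapping and validation step SlideTap automates.
# Column names, file, and required fields are all hypothetical.
import pandas as pd

COLUMN_MAP = {"PatID": "case_id", "Kleuring": "stain", "Orgaan": "organ"}

df = pd.read_excel("local_lims_export.xlsx")          # one row per slide
df = df.rename(columns=COLUMN_MAP)[list(COLUMN_MAP.values())]

# Simple validation: reject incomplete rows before anything is uploaded.
missing = df[df.isna().any(axis=1)]
if not missing.empty:
    raise ValueError(f"{len(missing)} rows are incomplete")
```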
DICOM Formatting Tools
Tools to convert pathology images into the DICOM format, a global imaging standard. This supports interoperability between labs and removes one of the key blockers to AI development at scale. Jeroen: “DICOM has been discussed for 20 years. We’re now delivering the software to finally implement it.”
Learn more about Bigpicture’s DICOM tools
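The conversion tools themselves are not reproduced here, but a short pydicom snippet shows what the payoff looks like from code: after conversion, the image and its descriptive attributes travel together in one self-describing, standards-conformant file. The file name is a placeholder.

```python
# Inspecting a converted DICOM slide-microscopy file with pydicom.
# The path is hypothetical; attributes shown are standard DICOM WSI fields.
import pydicom

ds = pydicom.dcmread("converted_slide.dcm")

print(ds.Modality)                   # "SM" for slide microscopy
print(ds.TotalPixelMatrixColumns,    # full-resolution slide dimensions
      ds.TotalPixelMatrixRows)
print(ds.ImageType)                  # derivation / pyramid level info
```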
Cytomine Integration
A powerful open-source viewer and annotation tool integrated into the Bigpicture platform. Cytomine allows researchers to explore whole-slide images (WSIs) in-browser and even apply AI during the annotation process. Jeroen: “They’re now integrating AI into the frontend, and that’s really powerful. It enables live, AI-assisted annotation.”
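Annotations made in the browser can also be retrieved programmatically. The sketch below follows the connection pattern of the open-source Cytomine Python client; the host, keys, and project ID are placeholders, and details may differ between client versions.

```python
# Sketch of programmatic access with the Cytomine Python client.
# Host, keys, and project ID are placeholders.
from cytomine import Cytomine
from cytomine.models import AnnotationCollection

with Cytomine(host="https://cytomine.example.org",
              public_key="PUBLIC_KEY",
              private_key="PRIVATE_KEY") as conn:
    annotations = AnnotationCollection()
    annotations.project = 42           # hypothetical project ID
    annotations.fetch()                # retrieve annotations made in-browser
    for a in annotations:
        print(a.id, a.location)        # WKT geometry of each annotation
```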
AI tools
StreamingCLAM
A weakly supervised deep learning framework designed to handle large-scale WSI data. Jeroen: “It’s a smart way to train models on massive image data, even without detailed annotations.” StreamingCLAM processes slides in smaller patches (streams) rather than loading entire WSIs at once while, in contrast to patch-based methods, still maintaining the entire WSI context.
Learn more about StreamingCLAM
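The aggregation idea at the heart of CLAM-style weak supervision can be sketched in a few lines: patch features are pooled into a single slide-level representation by learned attention weights, so no patch-level labels are needed. This illustrates the concept only; StreamingCLAM’s streaming implementation, which keeps memory bounded on gigapixel slides, is considerably more involved.

```python
# Conceptual sketch of attention-based pooling for weakly supervised WSI
# classification. Dimensions and data are illustrative.
import torch
import torch.nn as nn

class AttentionPool(nn.Module):
    def __init__(self, dim=512):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(dim, 128), nn.Tanh(),
                                   nn.Linear(128, 1))

    def forward(self, patch_features):              # (n_patches, dim)
        weights = torch.softmax(self.score(patch_features), dim=0)
        return (weights * patch_features).sum(dim=0) # one slide embedding

patches = torch.randn(1000, 512)       # features for 1000 streamed patches
slide_vec = AttentionPool()(patches)   # one vector for the whole slide
logit = nn.Linear(512, 1)(slide_vec)   # slide-level prediction, no patch labels
```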
WSI Registration tool
The WSI Registration tool aligns multiple WSIs from the same sample, such as H&E and IHC stains, into a unified spatial reference. This is critical for analysis workflows that require slide-level comparison. Jeroen: “It enables us to look at multiple stainings side-by-side, crucial for model development and diagnostics.”
Learn more about WSI Registration
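A toy version of the alignment idea, using OpenCV feature matching on two downsampled slide thumbnails, looks like this. File names are hypothetical, and Bigpicture’s tool operates on full-resolution WSIs with more robust methods; this only illustrates the principle of estimating one transform that maps one stain onto another.

```python
# Toy registration: align two slide thumbnails with ORB features and a
# RANSAC homography. Illustrative only; thumbnails are hypothetical files.
import cv2
import numpy as np

he = cv2.imread("he_thumb.png", cv2.IMREAD_GRAYSCALE)
ihc = cv2.imread("ihc_thumb.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(2000)
kp1, des1 = orb.detectAndCompute(he, None)
kp2, des2 = orb.detectAndCompute(ihc, None)

matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
matches = sorted(matches, key=lambda m: m.distance)[:200]

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# Estimate the transform that maps the H&E thumbnail onto the IHC one,
# then warp so both stains share one spatial reference.
H, _ = cv2.findHomography(src, dst, cv2.RANSAC)
aligned = cv2.warpPerspective(he, H, (ihc.shape[1], ihc.shape[0]))
```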
Content-Based Image Retrieval (CBIR)
This allows users to search Bigpicture datasets by visual similarity rather than by keyword. This unlocks new ways to explore data, find rare cases, or build training cohorts. Jeroen: “These tools will help users of the platform to build better models once the data is there. That’s a big step forward.”
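Stripped to its essentials, content-based retrieval means embedding every image with a vision model and ranking the corpus by similarity to a query embedding. The sketch below uses random vectors as stand-ins for real features; the retrieval step itself is the part that carries over.

```python
# Stripped-down CBIR sketch: rank a corpus of image embeddings by cosine
# similarity to a query. Random vectors stand in for real features.
import numpy as np

rng = np.random.default_rng(0)
corpus = rng.normal(size=(10_000, 512))   # one embedding per image
query = rng.normal(size=512)              # embedding of the query image

# Cosine similarity = dot product of L2-normalised vectors.
corpus_n = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
query_n = query / np.linalg.norm(query)

scores = corpus_n @ query_n
top5 = np.argsort(scores)[::-1][:5]
print("most visually similar images:", top5)
```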
AI Foundation Models
While Bigpicture is not directly building foundation models, its high-quality data and tools lay the groundwork for future AI-driven innovations. A Bigpicture Task Force is currently being established to explore opportunities to build AI models using data available through Bigpicture.
Learn more about Bigpicture’s Foundation Models