This article presents a comprehensive dataset of labeled game situations obtained from multiple professional handball matches, accompanying the research paper entitled "PlayNet: Real-time Handball Play Classification with Kalman Embeddings and Neural Networks". The dataset encompasses approximately 11 hours of footage from five handball games played in two different arenas, resulting in around 1 million data frames. Each frame has been meticulously labeled with one of seven game situation classes (left and right attacks, left and right transitions, left and right penalties, and timeouts). Notably, the dataset does not contain video frames; instead, it provides a synthetic, normalized representation of each frame. This representation includes the positions of players, referees, and the ball, as well as player and referee velocities, for every labeled game situation. These details were obtained automatically by running an object detector to infer the positions of players, referees, and the ball in each frame. After tracking the detected agents across frames, the extracted coordinates were normalized through a bird's-eye perspective transform, ensuring that the data remain unaffected by variations in camera configuration across arenas. Finally, a Kalman filter was applied to improve the robustness of the estimated positions and to derive the corresponding velocities. The labeling was performed by domain experts using a custom annotation system that records both the play type and its contextual setting. Researchers can use this dataset for several purposes, including game analysis, automated broadcasting, and game summarization. Furthermore, it can contribute to a broader understanding of the relationship between player dynamics and game situations, shedding light on the level of granularity required to classify them accurately.
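
To make the preprocessing pipeline concrete, the sketch below illustrates the two geometric steps mentioned above: mapping detected image coordinates to a normalized bird's-eye view via a homography, and smoothing the resulting positions with a constant-velocity Kalman filter to obtain velocities. This is not the authors' implementation; the court corner coordinates, frame rate, and noise parameters are hypothetical placeholders chosen only for illustration.

```python
# Minimal sketch (not the authors' code): normalize detections to a
# bird's-eye view and smooth them with a constant-velocity Kalman filter.
# Court corner pixel coordinates, frame rate, and noise levels are
# hypothetical placeholders.
import cv2
import numpy as np

# Hypothetical image-space corners of the court and their normalized targets.
COURT_CORNERS_PX = np.float32([[120, 80], [1800, 90], [1850, 1000], [60, 990]])
COURT_CORNERS_NORM = np.float32([[0, 0], [1, 0], [1, 1], [0, 1]])
H = cv2.getPerspectiveTransform(COURT_CORNERS_PX, COURT_CORNERS_NORM)

def to_birds_eye(points_px: np.ndarray) -> np.ndarray:
    """Map (N, 2) pixel coordinates to normalized court coordinates."""
    pts = points_px.reshape(-1, 1, 2).astype(np.float32)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)

class ConstantVelocityKF:
    """2-D constant-velocity Kalman filter; state = [x, y, vx, vy]."""

    def __init__(self, dt: float = 1 / 25, q: float = 1e-3, r: float = 1e-2):
        self.F = np.array([[1, 0, dt, 0],   # state transition
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]])
        self.Hm = np.array([[1, 0, 0, 0],   # observe position only
                            [0, 1, 0, 0]])
        self.Q = q * np.eye(4)              # process noise (assumed)
        self.R = r * np.eye(2)              # measurement noise (assumed)
        self.x = np.zeros(4)
        self.P = np.eye(4)

    def step(self, z: np.ndarray) -> np.ndarray:
        """Predict, then update with measurement z = [x, y]; return state."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        y = z - self.Hm @ self.x
        S = self.Hm @ self.P @ self.Hm.T + self.R
        K = self.P @ self.Hm.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.Hm) @ self.P
        return self.x                        # [x, y, vx, vy]

# Example: one tracked player's detections across three frames.
detections_px = np.array([[640.0, 512.0], [652.0, 509.0], [665.0, 507.0]])
kf = ConstantVelocityKF()
for pos in to_birds_eye(detections_px):
    state = kf.step(pos)
    print("position:", state[:2], "velocity:", state[2:])
```

Since the released data already contain positions and velocities in this normalized form, users of the dataset do not need to reproduce such a pipeline; the sketch is intended only to clarify how the provided representation was derived.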