Split learning (SL) is a recently introduced distributed machine learning technique that enables training a deep neural network without directly accessing the raw data produced in bulk on edge devices. SL holds great potential for resource-constrained devices because the network is partitioned between the clients and the server in accordance with their respective computational capacities. While the clients benefit from training only the smaller portion, the server is responsible for processing the dominant share of the neural network. This computational burden on the server becomes onerous, especially when a considerable number of clients participate in a training round. Likewise, the communication cost of transmitting activations from the client-side network grows with larger datasets and network sizes. Moreover, not all of a client's data is equally important for training the network. Data selection is a natural approach to reducing the computational burden and improving the performance of a neural network. However, existing data selection approaches are of limited use in SL, where the neural network is decentralized and distributed and the client data is private. This work is the first attempt to consider activation selection in the framework of split neural networks such as SL. The proposed technique selects uncertain activations generated by the client-side network through a small auxiliary network trained on the client's data. Only the selected subset of activations is then sent to the server for training the whole neural network. Extensive experimentation and empirical results corroborate that the proposed technique significantly reduces the computational burden on the server and the communication requirement between the server and the clients. Meanwhile, the low-level embeddings learned at the client side from its local data contribute to better performance (+3\%) compared to vanilla SL.
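The abstract does not specify how uncertainty is measured, so the following is only an illustrative sketch: a plausible realization scores each sample by the entropy of a small auxiliary classifier's softmax output and forwards only the most uncertain activations to the server. The function name `select_uncertain_activations`, the entropy criterion, and the `keep_ratio` parameter are all assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def select_uncertain_activations(activations, aux_probs, keep_ratio=0.5):
    """Keep the client-side activations whose auxiliary-network
    predictions are most uncertain (highest entropy).

    activations: (N, D) cut-layer activations from the client model.
    aux_probs:   (N, C) softmax outputs of a small auxiliary classifier
                 trained on the client's local data (hypothetical).
    Returns the selected activations and their sample indices.
    """
    eps = 1e-12  # avoid log(0)
    # Shannon entropy of each prediction as the uncertainty score.
    entropy = -np.sum(aux_probs * np.log(aux_probs + eps), axis=1)
    k = max(1, int(keep_ratio * len(activations)))
    # Indices of the k most uncertain samples.
    idx = np.argsort(entropy)[-k:]
    return activations[idx], idx

# Example: 4 samples, 3 classes; rows 1 and 3 are the least confident.
acts = np.arange(8, dtype=float).reshape(4, 2)
probs = np.array([[0.98, 0.01, 0.01],
                  [0.34, 0.33, 0.33],
                  [0.90, 0.05, 0.05],
                  [0.40, 0.30, 0.30]])
selected, idx = select_uncertain_activations(acts, probs, keep_ratio=0.5)
```

With `keep_ratio=0.5`, only half of the activations (here, the two near-uniform predictions) would be transmitted, which is the source of the claimed savings in server computation and client-server communication.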