Capsule networks have been shown to be powerful models for image classification, thanks to their ability to represent and capture viewpoint variations of an object. However, the high computational complexity of capsule networks, which stems from their recurrent dynamic routing, poses a major drawback, making their use for large-scale image classification challenging. In this work, we propose STAR-CAPS, a capsule-based network that exploits a straight-through attentive routing to address the drawbacks of capsule networks. By utilizing attention modules augmented by differentiable binary routers, the proposed mechanism estimates the routing coefficients between capsules without recurrence, as opposed to prior related work. Subsequently, the routers utilize straight-through estimators to make binary decisions to either connect or disconnect the route between capsules, allowing faster and more stable performance. Experiments conducted on several image classification datasets, including MNIST, SmallNorb, CIFAR-10, CIFAR-100 and ImageNet, show that STAR-CAPS outperforms the baseline capsule networks.
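As a minimal illustration of the straight-through binary-decision idea described above (a hedged sketch using NumPy with hand-written gradients, not the paper's implementation; the function names and shapes are assumptions), the forward pass thresholds a routing logit into a hard 0/1 connect/disconnect decision, while the backward pass pretends the threshold was the underlying sigmoid so gradients can still flow:

```python
import numpy as np

def sigmoid(x):
    # Numerically plain logistic function.
    return 1.0 / (1.0 + np.exp(-x))

def ste_binary_forward(logits):
    # Forward pass: take a hard binary routing decision per
    # capsule pair (1 = keep the route, 0 = cut it).
    probs = sigmoid(logits)
    gate = (probs > 0.5).astype(np.float64)
    return gate, probs

def ste_binary_backward(probs, grad_out):
    # Straight-through backward pass: ignore the
    # non-differentiable threshold and propagate the gradient
    # of the sigmoid instead (an illustrative STE variant).
    return grad_out * probs * (1.0 - probs)

# Hypothetical routing logits between two capsule layers.
logits = np.array([-2.0, 0.3, 1.5])
gate, probs = ste_binary_forward(logits)          # gate in {0, 1}
grad = ste_binary_backward(probs, np.ones_like(probs))
```

The hard gate keeps the forward computation binary (routes are either connected or disconnected), while the surrogate gradient keeps the router trainable end to end.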