After Edward Snowden’s revelations, large-scale, usable encryption of online communications became a matter of public concern. The most advanced and popular among recently developed encryption protocols is currently the Signal protocol. While the Signal protocol is widely adopted and considered an improvement over its predecessors, it remains officially unstandardised, even though an informal draft has been elaborated towards that goal. Analysing how this protocol was introduced and swiftly adopted by various applications, and how the encrypted messaging ecosystem subsequently transformed, sheds light on how a particular period in the history of secure messaging has been marked by a “de facto standardisation.” What can we learn about existing modes of governance of encryption, and about the histories of traditional standardisation bodies, by analysing the “standardisation by running code” approach adopted by Signal? And finally, how does the Signal protocol challenge a “linear,” evolution-based vision of messaging history? Drawing from a three-year qualitative investigation of end-to-end encrypted messaging, from a perspective informed by science and technology studies (STS), we seek to unveil the ensemble of processes that make the Signal protocol a quasi-standard.