{"publication":"arXiv:2201.11343","date_updated":"2022-11-18T09:33:01Z","type":"preprint","language":[{"iso":"eng"}],"citation":{"bibtex":"@article{Redder_Ramaswamy_Karl_2022, title={Distributed gradient-based optimization in the presence of dependent  aperiodic communication}, journal={arXiv:2201.11343}, author={Redder, Adrian and Ramaswamy, Arunselvan and Karl, Holger}, year={2022} }","chicago":"Redder, Adrian, Arunselvan Ramaswamy, and Holger Karl. “Distributed Gradient-Based Optimization in the Presence of Dependent  Aperiodic Communication.” ArXiv:2201.11343, 2022.","short":"A. Redder, A. Ramaswamy, H. Karl, ArXiv:2201.11343 (2022).","mla":"Redder, Adrian, et al. “Distributed Gradient-Based Optimization in the Presence of Dependent  Aperiodic Communication.” ArXiv:2201.11343, 2022.","ama":"Redder A, Ramaswamy A, Karl H. Distributed gradient-based optimization in the presence of dependent  aperiodic communication. arXiv:220111343. Published online 2022.","ieee":"A. Redder, A. Ramaswamy, and H. Karl, “Distributed gradient-based optimization in the presence of dependent  aperiodic communication,” arXiv:2201.11343. 2022.","apa":"Redder, A., Ramaswamy, A., & Karl, H. (2022). Distributed gradient-based optimization in the presence of dependent  aperiodic communication. In arXiv:2201.11343."},"user_id":"477","external_id":{"arxiv":["2201.11343"]},"date_created":"2022-04-06T06:53:38Z","department":[{"_id":"75"}],"title":"Distributed gradient-based optimization in the presence of dependent aperiodic communication","_id":"30790","author":[{"first_name":"Adrian","orcid":"https://orcid.org/0000-0001-7391-4688","full_name":"Redder, Adrian","last_name":"Redder","id":"52265"},{"first_name":"Arunselvan","orcid":"https://orcid.org/ 0000-0001-7547-8111","id":"66937","last_name":"Ramaswamy","full_name":"Ramaswamy, Arunselvan"},{"first_name":"Holger","id":"126","last_name":"Karl","full_name":"Karl, Holger"}],"abstract":[{"text":"Iterative distributed optimization algorithms involve multiple agents that\r\ncommunicate with each other, over time, in order to minimize/maximize a global\r\nobjective. In the presence of unreliable communication networks, the\r\nAge-of-Information (AoI), which measures the freshness of data received, may be\r\nlarge and hence hinder algorithmic convergence. In this paper, we study the\r\nconvergence of general distributed gradient-based optimization algorithms in\r\nthe presence of communication that neither happens periodically nor at\r\nstochastically independent points in time. We show that convergence is\r\nguaranteed provided the random variables associated with the AoI processes are\r\nstochastically dominated by a random variable with finite first moment. This\r\nimproves on previous requirements of boundedness of more than the first moment.\r\nWe then introduce stochastically strongly connected (SSC) networks, a new\r\nstochastic form of strong connectedness for time-varying networks. We show: If\r\nfor any $p \\ge0$ the processes that describe the success of communication\r\nbetween agents in a SSC network are $\\alpha$-mixing with $n^{p-1}\\alpha(n)$\r\nsummable, then the associated AoI processes are stochastically dominated by a\r\nrandom variable with finite $p$-th moment. 
In combination with our first\r\ncontribution, this implies that distributed stochastic gradient descent\r\nconverges in the presence of AoI if $\\alpha(n)$ is summable.","lang":"eng"}],"project":[{"_id":"16","name":"SFB 901 - C4: SFB 901 - Subproject C4"},{"name":"SFB 901: SFB 901","_id":"1"},{"_id":"4","name":"SFB 901 - C: SFB 901 - Project Area C"}],"status":"public","year":"2022"}