The ever-increasing number of devices connected to the Internet calls for algorithms that make efficient use of the network's communication and computation resources. In this thesis, we present algorithms that improve network utilization for two timely applications: network edge caching and distributed model training. For network edge caching, we propose a polynomial-time algorithm with approximation guarantees that exploits recommendation engines to promote cached contents, thereby improving the performance of the caching system. For distributed model training, we present new algorithms that trade communication complexity for faster convergence rates in two different frameworks:
decentralized optimization and federated learning. For the former, we propose an asynchronous algorithm based on dual optimization that exploits coordinate descent theory to reduce convergence time by letting the agents design their communication pattern as the optimization progresses.
For federated learning, we show that inter-agent communication can partially mitigate the negative effect of infrequent communication with the central server, and that the achieved speedup depends directly on the connectivity of the inter-agent network.