Federated learning and analytics are a distributed approach for collaboratively learning models from decentralized data, motivated by and designed for privacy protection. The distributed learning process can be formulated as solving federated optimization problems, which emphasize communication efficiency, data heterogeneity, compatibility with privacy and system requirements, and other constraints that are not primary considerations in other problem settings.

This monograph describes a novel optimization solution framework, called alternating gradient descent (GD) and minimization (AltGDmin), that is useful for many problems for which alternating minimization (AltMin) is a popular solution approach. AltMin is a special case of the block coordinate descent algorithm. It is used for certain classes of problems for which minimization with respect to one subset of variables, keeping the other fixed, can be solved in closed form or otherwise reliably; bilinear problems are one example. This set of problems includes those for which one of the two minimization steps of an AltMin iteration is partly decoupled. An optimization problem is decoupled if it can be solved by solving smaller-dimensional, and hence faster, problems over disjoint subsets of the optimization variable.
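As a toy illustration of these ideas (a minimal sketch, not the monograph's exact algorithm), consider the bilinear problem min over U, B of the squared Frobenius norm of X - UB. For fixed U, the minimization over B decouples across the columns of B into many small least-squares problems, one per column; AltGDmin replaces the costly full minimization over U with a single GD step. In the Python sketch below, all names (altgdmin, the step size eta, the QR re-orthonormalization of U) are our illustrative choices, not taken from the text.

```python
import numpy as np

def altgdmin(X, r, eta=0.01, n_iters=100, seed=0):
    """Illustrative AltGDmin sketch for X ~= U @ B with rank r (assumed setup)."""
    n, q = X.shape
    rng = np.random.default_rng(seed)
    # Initialize U as a random n x r orthonormal matrix (illustrative choice).
    U, _ = np.linalg.qr(rng.standard_normal((n, r)))
    for _ in range(n_iters):
        # Minimization step: for fixed U, min_B ||X - U B||_F^2 decouples
        # column-wise; each column of B is a small r-dimensional least-squares
        # problem (lstsq solves all q of them at once here).
        B = np.linalg.lstsq(U, X, rcond=None)[0]
        # GD step: a single gradient step on U instead of a full minimization.
        grad_U = (U @ B - X) @ B.T
        U = U - eta * grad_U
        # Re-orthonormalize U (assumed projection step in this sketch).
        U, _ = np.linalg.qr(U)
    # Recompute B so the returned pair matches the final U.
    B = np.linalg.lstsq(U, X, rcond=None)[0]
    return U, B
```

Because the B-update splits into disjoint column-wise subproblems, it is exactly the kind of partly decoupled minimization step described above; in a federated setting, each node could solve the subproblems for its own columns locally.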