The LAM-PINN framework uses compositional meta-learning to solve partial differential equations across diverse task families. Instead of a single global initialization shared by all tasks, it selects adaptive modules per task, which prevents negative transfer between dissimilar problems. This lowers the computational cost of training Physics-Informed Neural Networks, and practitioners can handle task heterogeneity without retraining a separate model for every coefficient change.
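The module-selection idea can be illustrated with a minimal sketch. Everything below is a hypothetical toy, not the LAM-PINN implementation: `make_library`, `route`, and `adapt` are invented names, the "modules" are plain weight lists keyed by a PDE coefficient regime, and the inner loop is a stand-in for task-specific fine-tuning.

```python
# Toy illustration (assumed, not the LAM-PINN API): instead of one global
# initialization, keep a library of module initializations and route each
# task to the closest one before adapting.

def make_library(coeffs):
    """One initialization per coefficient regime (stand-in for learned modules)."""
    return {c: [c, -c] for c in coeffs}  # toy "weights" per module

def route(library, task_coeff):
    """Pick the module whose regime is closest to the task's PDE coefficient."""
    return min(library, key=lambda c: abs(c - task_coeff))

def adapt(weights, task_coeff, steps=5, lr=0.1):
    """Toy inner-loop adaptation: nudge weights toward the task coefficient."""
    w = list(weights)
    for _ in range(steps):
        w = [wi - lr * (wi - task_coeff) for wi in w]
    return w

library = make_library([0.1, 1.0, 10.0])
task_coeff = 1.3
init = library[route(library, task_coeff)]  # module-specific start, not global
adapted = adapt(init, task_coeff)
```

The point of the sketch is the routing step: a task with coefficient 1.3 starts adaptation from the module trained on the 1.0 regime rather than from an initialization averaged over all regimes, so a few adaptation steps suffice instead of full retraining.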