The LAM-PINN framework applies compositional meta-learning to solve partial differential equations across diverse task families with physics-informed neural networks. In place of a single global initialization shared by all tasks, it composes adaptive modules per task, which mitigates negative transfer when training data is scarce. This also reduces the computational cost of retraining a separate network for each task, so practitioners can deploy physics-informed models across heterogeneous tasks with greater stability.
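The core idea of composing adaptive modules instead of adapting one global initialization can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the LAM-PINN implementation): a bank of meta-learned parameter modules is mixed through a softmax gate, and only the gate is adapted per task with a few gradient steps on a stand-in loss that plays the role of a PDE residual. All names (`modules`, `adapt`, the quadratic `task_loss`) are assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical module bank: K parameter vectors of dimension D that
# would be meta-learned across task families (held fixed here).
K, D = 4, 8
modules = rng.normal(size=(K, D))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def task_loss(theta, target):
    # Stand-in for a PDE residual loss: distance to the task's optimum.
    return 0.5 * np.sum((theta - target) ** 2)

def adapt(target, steps=100, lr=0.2):
    """Adapt only the K gating logits to a new task; the modules stay fixed."""
    logits = np.zeros(K)  # uniform mixture = the naive "global" init
    for _ in range(steps):
        w = softmax(logits)
        theta = w @ modules                 # composed parameters
        grad_theta = theta - target        # d loss / d theta
        grad_w = modules @ grad_theta      # chain rule into the mixture weights
        jac = np.diag(w) - np.outer(w, w)  # softmax Jacobian (symmetric)
        logits -= lr * (jac @ grad_w)
    return softmax(logits) @ modules

# Two heterogeneous tasks, each with an optimum near a different module:
# per-task gating beats the shared mean-of-modules initialization.
for k in (0, 1):
    target = modules[k] + 0.1 * rng.normal(size=D)
    before = task_loss(modules.mean(axis=0), target)  # single global init
    after = task_loss(adapt(target), target)          # composed, adapted init
    assert after < before
```

Because only K gating logits are updated per task rather than the full parameter vector, adaptation is cheap, which is the mechanism behind the reduced retraining cost described above, under the assumptions of this toy setup.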