We consider a dynamic mobile edge computing (MEC) network in which multiple computational access points (CAPs) serve user equipment (UEs). UEs may join or leave the network due to mobility, causing dynamic changes in the network topology. To fully exploit the computational resources of the MEC network, the offloading decisions, transmission power, and computational resources must be allocated appropriately, and a robust design that addresses these issues is necessary. In this work, we propose a robust hierarchical learning approach that deploys deep Q networks (DQNs) at the UEs and deep neural networks (DNNs) at the CAPs. Each UE interacts with the network environment and learns the locally best offloading policy. By sharing the locally learned policies with the CAPs, the CAPs learn the relation between UE location and the locally best strategy. In simulations, the proposed robust approach suppresses the cost peaks caused by dynamic topology changes by up to 160% compared with a non-robust algorithm. This demonstrates the necessity and benefit of robust design in a more realistic and dynamic MEC network.
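The two-level idea can be sketched as follows. This is a toy, illustrative stand-in only: the paper's actual DQN/DNN architectures, state spaces, and cost model are not given in the abstract, so here a tabular bandit-style Q-learner plays the role of each UE's DQN, a simple threshold fit plays the role of the CAP's DNN, and the offloading cost model (fixed local cost vs. distance-dependent offload cost) is a hypothetical assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def ue_learn_policy(location, episodes=200, lr=0.5, eps=0.1):
    """UE side (DQN stand-in): learn the locally best offloading action.

    Actions: 0 = compute locally, 1 = offload to the CAP.
    Hypothetical cost model: local cost is fixed at 1.0, offload cost
    grows with the UE's distance from the CAP.
    """
    q = np.zeros(2)  # Q-values for [local, offload]
    for _ in range(episodes):
        # epsilon-greedy action selection
        a = rng.integers(2) if rng.random() < eps else int(np.argmax(q))
        cost = 1.0 if a == 0 else 0.2 + abs(location)
        q[a] += lr * (-cost - q[a])  # one-step update toward the negative cost
    return int(np.argmax(q))  # locally best offloading decision

# UEs at various locations learn locally, then share (location, best action)
# pairs with the CAP -- the "policy sharing" step of the hierarchical scheme.
locations = np.linspace(-2.0, 2.0, 21)
shared = [(x, ue_learn_policy(x)) for x in locations]

# CAP side (DNN stand-in): from the shared data, learn the relation between
# UE location and the locally best strategy -- here a distance threshold.
xs = np.array([abs(x) for x, _ in shared])
ys = np.array([a for _, a in shared])
threshold = (xs[ys == 1].max() + xs[ys == 0].min()) / 2

def cap_predict(location):
    """CAP predicts the best offloading action for a (possibly new) UE."""
    return 1 if abs(location) <= threshold else 0
```

The point of the sketch is the division of labor: each UE only explores its own local cost trade-off, while the CAP generalizes the shared local policies to any UE position, including UEs that newly join after a topology change.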