Ensuring safety and providing obstacle conflict alerts for small unmanned aircraft is vital to their integration into civil airspace. Many techniques exist for real-time, robust drone guidance, but most demand substantial computation time or memory, which makes them impractical to deploy onboard aircraft with limited computational resources. To provide safe and computationally efficient guidance for unmanned aircraft operations, we propose a framework that uses a deep reinforcement learning algorithm to guide autonomous UAS to their destinations while avoiding static and moving obstacles through continuous control. After offline training, the model requires less than 100 KB of memory, and computing a conflict resolution advisory online takes only 2 ms. For verification and validation of the algorithm, an airspace simulator is built in Python; numerical experiments show that the trained model provides accurate and robust guidance under environmental uncertainty.
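To make the low-memory, low-latency claim concrete, the sketch below shows the kind of compact policy network such a framework could rely on: after offline training, online guidance reduces to a single small forward pass per decision step. The layer sizes, the 8-dimensional state vector, and the tanh-bounded heading-rate output are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

class GuidancePolicy:
    """Tiny feedforward policy: state vector -> bounded continuous command."""

    def __init__(self, state_dim=8, hidden=32, seed=0):
        rng = np.random.default_rng(seed)
        # Two small dense layers: 8*32 + 32 + 32*1 + 1 = 321 float32 params
        # (~1.3 KB), well under the ~100 KB memory figure quoted above.
        self.W1 = rng.standard_normal((state_dim, hidden)).astype(np.float32) * 0.1
        self.b1 = np.zeros(hidden, dtype=np.float32)
        self.W2 = rng.standard_normal((hidden, 1)).astype(np.float32) * 0.1
        self.b2 = np.zeros(1, dtype=np.float32)

    def act(self, state):
        """One forward pass; tanh keeps the heading-rate command in (-1, 1)."""
        h = np.tanh(state @ self.W1 + self.b1)
        return float(np.tanh(h @ self.W2 + self.b2))

policy = GuidancePolicy()
# Hypothetical state: own position/velocity, goal bearing, nearest-obstacle info.
state = np.array([0.1, -0.3, 0.5, 0.0, 0.2, 0.9, -0.4, 0.05], dtype=np.float32)
cmd = policy.act(state)
print(-1.0 < cmd < 1.0)  # bounded continuous control output
```

A forward pass this small is dominated by two tiny matrix-vector products, which is consistent with a millisecond-scale advisory time on modest onboard hardware.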