<ul data-eligibleForWebStory="true">
<li>Graph combinatorial optimization (GCO) problems are crucial in domains such as logistics and bioinformatics.</li>
<li>Large language models (LLMs) open new avenues for structured reasoning in GCO but struggle with complex tasks.</li>
<li>The Optimal Thoughts Design (OTD) problem is formalized to guide the production of high-quality intermediate reasoning steps.</li>
<li>GraphThought is a new framework that generates effective reasoning sequences via either forward search or backward reasoning.</li>
<li>Llama-GT, an 8B-parameter model obtained by fine-tuning LLMs on structured thought sequences, excels at GCO tasks.</li>
<li>It outperforms much larger models such as DeepSeek-V3, showing that performance gains need not come from increased model scale.</li>
</ul>