To solve this problem, we first define the decision variables and the objective function. Let \(O\) denote the number of acres of oats and \(F\) the number of acres of flaxseed. With a profit of $500 per acre of oats and $400 per acre of flaxseed, the total profit is \(500O + 400F\).

The constraints are:
1. The farmer has 50 acres of land in total: \(O + F \leq 50\).
2. He must grow at least 5 acres of oats: \(O \geq 5\).
3. He must grow at least 8 acres of flaxseed: \(F \geq 8\).
4. The amount of oats cannot exceed twice the amount of flaxseed: \(O \leq 2F\).

Given these constraints, we aim to maximize the total profit.
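Before turning to the solver, a quick hand check (assuming the optimum lies where two constraints bind): oats are the more profitable crop, so the optimum should push \(O\) as high as the constraints allow. Making \(O = 2F\) and \(O + F = 50\) both binding gives \(3F = 50\), i.e. \(F = 50/3 \approx 16.67\) and \(O = 100/3 \approx 33.33\), for a profit of \(500 \cdot \tfrac{100}{3} + 400 \cdot \tfrac{50}{3} = \tfrac{70000}{3} \approx \$23{,}333.33\). The solver should reproduce this.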

Here is the Gurobi code to solve this linear programming problem:

```python
from gurobipy import Model, GRB

# Create a new model
m = Model("Farmer_Optimization")

# Define variables
O = m.addVar(vtype=GRB.CONTINUOUS, name="oats", lb=0)
F = m.addVar(vtype=GRB.CONTINUOUS, name="flaxseed", lb=0)

# Set the objective function to maximize profit
m.setObjective(500*O + 400*F, GRB.MAXIMIZE)

# Add constraints
m.addConstr(O + F <= 50, "total_land")
m.addConstr(O >= 5, "min_oats")
m.addConstr(F >= 8, "min_flaxseed")
m.addConstr(O <= 2*F, "oats_vs_flaxseed")

# Optimize the model
m.optimize()

# Print the solution
if m.status == GRB.OPTIMAL:
    print("Optimal solution found:")
    print(f"Oats: {O.X:.2f} acres")
    print(f"Flaxseed: {F.X:.2f} acres")
    print(f"Maximum Profit: ${m.objVal:.2f}")
else:
    print("No optimal solution found.")
```
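If Gurobi is not installed, this model is small enough to cross-check with only the standard library. The sketch below (an illustrative vertex-enumeration approach, not part of the original solution) relies on the fact that a linear program's optimum lies at a vertex of the feasible region, i.e. at an intersection of two constraint boundaries:

```python
from itertools import combinations

# Constraints written as a*O + b*F <= c (">=" constraints are negated)
cons = [
    (1, 1, 50),    # O + F <= 50   (total land)
    (-1, 0, -5),   # O >= 5        (minimum oats)
    (0, -1, -8),   # F >= 8        (minimum flaxseed)
    (1, -2, 0),    # O <= 2F       (oats at most twice flaxseed)
]

def feasible(O, F, eps=1e-9):
    """Check that a point satisfies every constraint (with a small tolerance)."""
    return all(a * O + b * F <= c + eps for a, b, c in cons)

best = None
# Each candidate vertex is the intersection of two constraint lines.
for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        continue  # parallel lines: no intersection
    O = (c1 * b2 - c2 * b1) / det
    F = (a1 * c2 - a2 * c1) / det
    if feasible(O, F):
        profit = 500 * O + 400 * F
        if best is None or profit > best[0]:
            best = (profit, O, F)

print(best)  # → approximately (23333.33, 33.33, 16.67)
```

This brute-force check only works because the model has two variables and four constraints; for anything larger, a proper LP solver such as Gurobi is the right tool.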