I'm sure Dell will be along soon with figures and ratings, but as I understand it, the larger the AWG rating, the smaller the conductor's cross-section (1).
In short, the smaller the conductor's cross-section, the more resistance it presents to current flow. The more resistance in the conductor, the more voltage is dropped along its length. So the more expensive 10AWG cable will give a smaller voltage drop along its length, and will therefore charge the battery faster than supplying the same current through 12AWG cable.
If we think about it, adding a longer wire between the solar panel and the battery adds resistance to that connection. That resistance produces a voltage drop along the length of the wire, which decreases the voltage seen across the battery terminals, because a small proportion of the charging voltage from the panel is lost along the charging cable. If you have 10m of 12AWG cable, you have roughly 0.052 ohms (1) of resistance between the panel and the battery. If the panel supplies around 10 amps (2) to charge the battery, you would lose about 0.52 volts along the cable, so a 14.4V output from the panel would appear as only about 13.88V at the battery. With 10AWG cable the drop would be only about 0.33V, giving roughly 14.07V at the battery terminals.
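If it helps, here is a minimal Python sketch of that arithmetic, using the same assumed round figures as above: roughly 5.2 milliohms per metre for 12AWG and 3.3 milliohms per metre for 10AWG (approximate values of the sort listed in a table like reference (1)), a 10m run, 10A of charge current and 14.4V at the panel. Treat the numbers as illustrative, not exact.

```python
# Rough voltage-drop sketch for the figures in the paragraph above.
# Per-metre resistances are approximate, typical AWG-table values.
OHMS_PER_METRE = {
    "12 AWG": 0.0052,  # ~5.2 milliohms per metre (assumed)
    "10 AWG": 0.0033,  # ~3.3 milliohms per metre (assumed)
}

CABLE_LENGTH_M = 10    # cable run assumed in the example above
CHARGE_CURRENT_A = 10  # panel's optimum operating current, reference (2)
PANEL_VOLTAGE = 14.4   # charging voltage at the panel

for gauge, ohms_per_m in OHMS_PER_METRE.items():
    resistance = ohms_per_m * CABLE_LENGTH_M  # total cable resistance
    drop = CHARGE_CURRENT_A * resistance      # Ohm's law: V = I * R
    at_battery = PANEL_VOLTAGE - drop
    print(f"{gauge}: {resistance:.3f} ohm cable, {drop:.2f} V drop, "
          f"{at_battery:.2f} V at the battery")
```

Running it reproduces the figures above: about 13.88V at the battery through 12AWG and about 14.07V through 10AWG.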
The lower the charging voltage, the longer the battery takes to reach full charge. In the end it may come down to cost and your budget.
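As a rough illustration of why cost may be the deciding factor (a sketch only, using the same assumed figures as above and ignoring the charge controller and battery chemistry), the power wasted as heat in the cable itself is I squared times R:

```python
# Hedged rough comparison of power dissipated in the cable (P = I^2 * R),
# using the same assumed cable resistances as the sketch above.
CHARGE_CURRENT_A = 10
CABLE_RESISTANCE_OHMS = {"12 AWG": 0.052, "10 AWG": 0.033}  # assumed, 10m run

for gauge, resistance in CABLE_RESISTANCE_OHMS.items():
    heat_watts = CHARGE_CURRENT_A ** 2 * resistance
    print(f"{gauge}: ~{heat_watts:.1f} W dissipated in the cable as heat")
```

That is roughly 5W lost in the 12AWG cable versus roughly 3W in the 10AWG, a couple of watts' difference from a panel rated around 200W, so the thicker cable does help, but the gain is modest.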
(1) AWG - American Wire Gauge Current Ratings
(2) 200 Watt 12 Volt Monocrystalline Foldable Solar Suitcase (spec - Optimum Operating Current)