First of all let me say how much I admire your graciousness and openness to constructive criticism. I always get a little nervous when I post criticism that leans negative, and the kind words you had for me (in addition to the thoughtful responses) made me feel great.

One interesting way to look at the mechanic (I skirted around this in my last post, but failed to express it fully) is to look at the effects of choosing one stat over another - in this case, raising your ATK but keeping your DEX lower (than the opponent's AGI), versus raising your DEX but keeping your ATK at a smaller value. Just making up some numbers out of the blue:

- Raise ATK: You now do 120 damage instead of 100 as a base value, but your lower DEX will cause the variance to be -20% to -0%, meaning in practice you're doing 96 to 120 damage.
- Raise DEX: You still do 100 damage as a base value, but your higher DEX will cause the variance to be 0% to 20%, meaning in practice you're doing 100 to 120 damage.
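To make the comparison concrete, here's a tiny sketch using the made-up numbers from the bullets above (the `damage_range` helper and the fixed 20% window are my own assumptions, just for illustration):

```ruby
# Hypothetical numbers from the bullets above: raise ATK (bigger base, negative
# variance window) vs. raise DEX (same base, positive variance window).

def damage_range(base, low_bias, high_bias)
  [(base * (1 + low_bias)).round, (base * (1 + high_bias)).round]
end

raise_atk = damage_range(120, -0.20, 0.0)   # => [96, 120]
raise_dex = damage_range(100,  0.0,  0.20)  # => [100, 120]

# Midpoints show why one choice is only marginally better than the other:
avg_atk = raise_atk.sum / 2.0   # 108.0
avg_dex = raise_dex.sum / 2.0   # 110.0
```

Both choices top out at the same 120, and the averages land within a couple of points of each other, which is exactly the "same thing, different side of the equation" problem.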

One stat was marginally better in this example (and the player needed to understand the entire variance bias system and do the math to figure out which one), but essentially, the choice isn't that interesting because whether you're raising the damage on the "formula" side of the equation or the "variance" side of the equation, in the end you're doing the same thing (raising damage dealt with any given attack/skill).

As a disclaimer, I'd like to say that AGI and DEX probably __are not really significant stats__ for you to pay attention to.

I always like to look at stats as a way of saying: "How good is a character at doing certain things?" The clearer the relationship between the stat and those things, the better. And that's part of why I feel that all stats should be significant and worth paying attention to.

- If the stat isn't significant to pay attention to because it doesn't do anything worth the player's attention (like RPG Maker's default LUK as used in state application), the stat should be discarded.
- If the stat isn't significant to pay attention to because it often will be equal to other battlers' stats (I believe this is your case based on what you said later in your post), and only a few equips/states/etc. will change it, then I usually like to move the effects from stats into Passives. In this case, you could have passives like "Gives your abilities a higher chance for positive variance" (to replace something that gives you higher DEX) or "Gives incoming hostile abilities a higher chance for negative variance" (replace something that raises your AGI).

In a game where speed/turn order is completely irrelevant (for example, in FTB (Free Turn Battle) or similar systems), it's hard to "represent" that a character is agile. You could argue that granting extra actions would represent it, but that tends to break the game; that's a topic for another thread, though.

I'm a big fan of trying to sell the dream of Agility, too.

A couple of the older *Final Fantasy* games gave most attacks/skills multiple hits (for example, a Sword Flurry type of attack might hit anywhere from 3 to 7 times), and the actual number of hits would be based on character stats. This is sort of like variance, and I think it's a good way to represent the concept in a slightly simpler, more satisfying way.

Ultimately, games that allow characters to move around a physical battlefield are going to be able to represent Agility in the most satisfying ways (but are so hard to get right in other ways), and among games that don't, the ATB system is probably the next-best thing for representing Agility.

For a skill that has a small window of variance, this is true. If a skill deals around 95 ~ 105 damage, there's no point paying attention to these stats. However, for a skill with large variance, buffing the DEX stat or reducing the enemy's AGI becomes more crucial. Say a skill deals 10 ~ 190: if you have a skill that completely immobilizes the target so that they have zero AGI, then whatever the attacker's DEX, they are guaranteed to hit at the maximum of the range, and that's before counting element modifiers and maybe more.

I assume that 90% base variance (10~190), which means a wild 19x spread in outcomes from worst to best (10 damage vs. 190 damage), is meant as an intentionally extreme example that you wouldn't ever actually put in your game... but even in this extreme example, I think my concern still stands. As long as a character's DEX is high enough not to hit the low end of the range, the most they'll ever get by increasing their DEX to dizzying heights is x1.90 damage, whereas by increasing their ATK to dizzying heights they might achieve x5.00 or x10.00 damage.
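The ceiling argument can be shown with a couple of lines of arithmetic (made-up numbers, and I'm assuming a typical linear damage formula for the ATK side):

```ruby
# For the 10~190 example: DEX has a hard ceiling of x1.90 on the base damage,
# while raising ATK keeps moving the entire damage window upward.

base = 100
dex_ceiling = (base * 1.90).round      # 190 -- the best any amount of DEX can do
atk_tripled = (base * 3 * 1.90).round  # 570 -- tripling ATK scales the ceiling too
```

However high DEX climbs, it only moves you around inside the window; ATK moves the window itself.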

I'll admit there's a reasonable argument to be made that the choice between increasing ATK (to slam low-AGI enemies) and increasing DEX (to avoid really low rolls against high-AGI enemies with high-variance skills) can be interesting. But it's a very opaque calculation (you don't know what enemy AGI will look like in advance while you're choosing which stat to build), and ultimately there are still other decisions that would probably be more interesting than "in which way do I deal more damage".

For a skill that has a small window of variance, this is true. If a skill deals around 95 ~ 105 damage, there's no point paying attention to these stats. ... On the other hand, if the target has higher AGI, it might be better to attack it with a lower-variance skill.

Interesting little story - in *timeblazer*, the game I'm working on now, I had the same types of thoughts as you're expressing - for years, in fact. For a long time, I thought it would be neat to include a lot of stats, and make sure that each type of stat was relevant for different types of skills, so that part of the strategy was picking the right skills to use based on the stats you were building for a character. One way I went about this was to give my warrior characters (and enemies) "physical damage" skills which solely used ATK vs. DEF in the damage formula, and "mixed damage" skills which used ATK & MAG vs. DEF & MDF. Warrior equipment that raised MAG raised it by a lot, since only a small number of skills used it.

What I ended up finding through a lot of playtesting was that although this created interesting build paths, it became too limiting *within* battle, and ended up working against my original design goal of interesting choices. If you built solely ATK, all of the mixed damage skills became weaker than similar physical damage skills. If you built a lot of MAG alongside your ATK, the mixed damage skills became so good that the physical damage skills weren't really worth using. It ended up limiting the number of "live" (good) options on any given turn of battle to the small subset of skills that could make good use of your stats. (Additionally, it was very hard to tell what type of damage enemies were inflicting, which meant less clarity for players as to why they were taking high or low amounts of damage.)

What I ended up doing was removing damage types entirely and reworking the entire stat system, combining ATK and MAG into a single stat (that both warriors and mages could use), as well as combining DEF and MDF into a single stat. For battlers where it really seemed like they should be doing better or worse against warriors or mages, I gave them Passive States (visible to the player) that reduce damage against either Physical or Elemental skills, which is a lot more intuitive than damage types. The rework took a while but the battle system plays out so much more clearly now. Since then, I've always advocated for reducing the number of stats in a game to the minimum number necessary for engaging and interesting choices to be available.

I see a strong parallel between that experience and what you have in mind: high-variance and low-variance damage skills in a game where stats determine the bias of the variance calculation. You may be unintentionally limiting the player to the highest- or lowest-variance skills they know, rather than letting them choose from their entire repertoire of diverse skills.

I personally recommend the stat growth be something like this.

View attachment 110876
Aka, yes, it's FLAT. You control the character's DEX/AGI manually through the game, so you don't need to balance out the numbers at different levels. As I said in the first point, these stats __probably are not worth paying attention to__ when they are at the "standard number" - i.e., you have 20 DEX and the enemy has 20 AGI, both cancel out, and the variance plays out like in vanilla.

However, maybe you have a state that boosts DEX by a lot, say +200 for several turns; then you may start to notice the damage hitting the higher numbers in the variance range. And since these stats aren't worth your constant attention, applying the buff shouldn't feel like a waste of a turn either - maybe the buffing skill also does something else at the same time. The same goes for an AGI buff.

The end goal of this concept is ultimately to control/minimize the variance, and to give the player access to doing so when they want (like I said, by applying buffs/debuffs).

While certainly a noble end-goal, a sincere question I have is: Do you think it's good design to force the player to maximize a stat in order to get consistent results (reliability), or do you think it would be better to start with minimal variance and allow the player to opt into it (e.g. with a weapon that deals lots of damage but sometimes misses)?

This is basically how my latest formula goes.

```ruby
agi = 10
dex = 10

total  = agi + dex
middle = dex / total.to_f            # 0.5 when the stats are equal
var    = 0.5 - (0.5 - middle).abs    # max distance the roll can stray from middle

# a roll centered on middle, spanning middle - var .. middle + var
variance_rate = middle + rand * var - rand * var
```

I do like this formula probably the most out of everything we've discussed so far, though it does have a few quirks. (And I'm assuming you make a sign check or an AGI < or > DEX comparison elsewhere, since in your code variance_rate will be the same with 10/90 vs. 90/10.)

As mentioned above, it's worth considering that when AGI and DEX are the same (like they'll normally be), you'll end up encompassing the whole range of variance, meaning that if Variance is "worth considering", you'll probably end up with a very wide spread of damage in "normal" situations. This can make it very hard to balance - maybe the boss is tough but fair when you're dealing 100 damage per hit, but if you get unlucky a bunch and hit for 30 or 40 (or even 75) instead, you won't stand a chance against it.

There's also a sort of weird Diminishing Returns curve going on. For example, if your DEX is one-ninth of the target's AGI, you'll deal an average of 0.1 of the total Variance damage (with a range of 0.0 to 0.2); if your DEX is one-fourth of the target's AGI, you'll average 0.2 (0.0 to 0.4); reaching 0.3 requires a little less than half the target's AGI; and so on until you get to 0.5, after which it becomes a mirror (so four times the target's AGI gets you an average of 0.8 variance, and nine times gets you 0.9). Intriguing, but not necessarily satisfying.

If I understand this correctly: theoretically, if the variance of a skill is 80 ~ 120 and both stats are equal, it behaves like the default; but if DEX is higher, the variance is shifted so that the maximum of the range is not 120 but (without plugging in actual DEX and AGI numbers) could be, say, 130?

Yes. In my first suggestion, the maximum amount of damage you could deal, if the target had 0 AGI, would be 140 (which is twice the listed Variance in the skill of 20). When AGI and DEX are equal, the listed Variance (20%) is the most it could shift either way, but as the ratios become extreme, it can nearly approach twice as much.

Frankly, I don't quite understand this example. Are you referring to my roulette selection formula?

No worries; it's always hard communicating technical concepts through words alone.

To give an example with some numbers, let's assume we have a skill with a formula that will do 100 damage, and it has 20% Variance listed. We've decided to make all skills use 8 variance draws. Assume that the attacker has 50 DEX and the target has 50 AGI. Since they're equal, we will divide the 8 variance draws up equally - 4 positive, and 4 negative. I'll choose random results for the Variance draws:

* Draw 1: From 0.0 to +0.05: +0.04
* Draw 2: From 0.0 to +0.05: +0.02
* Draw 3: From 0.0 to +0.05: +0.05
* Draw 4: From 0.0 to +0.05: +0.01
* Draw 5: From -0.05 to 0.0: -0.03
* Draw 6: From -0.05 to 0.0: -0.04
* Draw 7: From -0.05 to 0.0: -0.03
* Draw 8: From -0.05 to 0.0: -0.01

**TOTAL: +0.01**

Therefore, the 100 damage will be multiplied by 1.01 to get 101 damage. On a really unlucky attack, the first four draws might be close to 0.0 and the last four close to -0.05, which would get you damage in the 80s.

Now, if the attacker has **150** DEX and the target has 50 AGI, the attacker's DEX is three times as high, so three times as many variance draws should be positive as negative. In the case of 8 variance draws, that means 6 positive and 2 negative. Running the attack again with these numbers:

* Draw 1: From 0.0 to +0.05: +0.04
* Draw 2: From 0.0 to +0.05: +0.02
* Draw 3: From 0.0 to +0.05: +0.05
* Draw 4: From 0.0 to +0.05: +0.01
* Draw 5: From 0.0 to +0.05: +0.02
* Draw 6: From 0.0 to +0.05: +0.03
* Draw 7: From -0.05 to 0.0: -0.03
* Draw 8: From -0.05 to 0.0: -0.01

**TOTAL: +0.13**

Now the 100 damage will be multiplied by 1.13 to get 113 damage. Having an imbalance of positive over negative variance draws made it much more likely (though not definite) that the output would come out in the positives.
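For reference, here's a minimal sketch of the draw-based bias described above (the method name, keyword defaults, and rounding of the positive-draw count are all my own assumptions, not a definitive implementation). Each draw is worth up to 2 * variance / draws, so with every draw positive the total shift approaches twice the listed variance, matching the "nearly twice as much" ceiling mentioned earlier:

```ruby
# Biased variance via draws: DEX vs. AGI decides how many of the
# `draws` rolls count as positive; the rest count as negative.
def draw_variance(dex, agi, variance: 0.20, draws: 8)
  per_draw = 2.0 * variance / draws                 # 0.05 with the defaults
  positive = (draws * dex.to_f / (dex + agi)).round # 4 at 50/50, 6 at 150/50
  negative = draws - positive

  positive.times.sum { rand * per_draw } -
    negative.times.sum { rand * per_draw }
end

# damage = (100 * (1 + draw_variance(150, 50))).round  # tends to land above 100
```

With equal stats the result is centered on zero and bounded by the listed variance; at 150 DEX vs. 50 AGI the reachable window shifts to -0.10..+0.30, just like the worked example.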

My methods can still produce either "good" or "bad" Variance modifiers until the ratios between DEX and AGI become truly extreme, which I view as an advantage of these methods over the ones you proposed. Your methods have advantages over mine, too, such as making an impact on battle strategy with less extreme differences between DEX and AGI. I think what we've each come up with now is pretty good as far as formulas go - it's just that, in the same breath, I'd still entirely avoid using stats to bias Variance in the first place.