running into a snag with my dictionary thing. i tried to over-optimize it and ran into some kind of hash table maintenance nightmare that i can't seem to debug. i think i'll just rewrite it and give up some speed; i guess nobody minds an extra few hundred milliseconds on the first log load.
This sounds sweet. Keep up the good work.
progress is slow (i'm moving soon), but i have auto-detection of fields working.
it also seems to detect single and dual bank closed loop configurations.
next i'll do some automatically filled filters. this will be like (pseudocode):
if (coolant temperature field exists) then add filter coolant temp > 80
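in python instead of pseudocode it'd be something like this (a rough sketch; the Filter class and field names are made up to show the shape of it, not real eehack code):

```python
from dataclasses import dataclass

# hypothetical names for illustration; not real eehack code
@dataclass
class Filter:
    field: str
    op: str
    value: float

def default_filters(field_names):
    """build the auto-filled filter list from whatever fields were detected."""
    filters = []
    if "coolant_temp" in field_names:
        # warmed-up data only: coolant temp > 80 (celsius assumed for now)
        filters.append(Filter("coolant_temp", ">", 80))
    return filters

print(default_filters(["rpm", "afr", "coolant_temp"]))
# -> [Filter(field='coolant_temp', op='>', value=80)]
```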
but where i'm struggling is, how do i know the unit of measure?
i'm pondering a system where i'll run a min/max value tally when the thing is loaded, so at least i can 'guess' if it's fahrenheit or celsius.
the alternative is to leave it up to the user, but that doesn't feel noob-proof enough to me.
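the tally guess could be as dumb as this (python sketch; the cutoff values are just my assumption of plausible coolant ranges, nothing validated):

```python
def guess_temp_unit(values):
    """guess fahrenheit vs celsius from a temp field's observed range.

    assumption: a running engine's coolant lives around 80-105 C,
    which is roughly 176-221 F, so the two ranges barely overlap.
    """
    lo, hi = min(values), max(values)
    if hi > 130:
        return "fahrenheit"   # no sane coolant temp in celsius gets up here
    if lo > -40:
        return "celsius"
    return "unknown"          # ambiguous; fall back to asking the user

print(guess_temp_unit([22, 45, 88, 96]))     # -> celsius
print(guess_temp_unit([71, 140, 195, 208]))  # -> fahrenheit
```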
... do we want knock event mapping?
i think we do.
i'll copy that right out of eehack.
just watch this spiral out of control.
I'm an Excel user. The attached screenshots are from my WBO2 Tuning Spreadsheet. The example posted is the AVERAGEIF formula for the 1000 RPM / 50 kPa cell: it filters out AE / PE / Open Loop, then uses the Idle Flag to average the AFRs where RPM is greater than 900 and less than 1100, and kPa is greater than 45 and less than 55. The AVERAGEIF formula processes the data on the sheet named "Insert Datalog". Maybe this type of logic will help with your Narrow Band Tuning? There are similar formulas for COUNTIF, along with Standard Deviation formulas if desired.
dave w
you have a hell of a spreadsheet setup, dave. i did glance at one of your spreadsheets when i wrote eehack's analyzer routines; it verified that at least real tuners were doing what i assumed would work well. after filtering to determine valid data points, it basically does a cell selection routine, then a gigantic average. you're doing the same stuff, but in implementation you're using formulas, and i'm doing it functionally with loops over each row. the result is the same.
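the loop version of that AVERAGEIF logic is basically this (python sketch; the cutoffs mirror dave's 1000 rpm / 50 kpa example, the row keys are made up):

```python
def cell_average_afr(rows):
    """loop-style version of the spreadsheet's filtered average for
    the 1000 rpm / 50 kpa cell. row keys are made up for illustration."""
    total, count = 0.0, 0
    for row in rows:
        # throw out AE / PE and open loop samples first
        if row["ae"] or row["pe"] or not row["closed_loop"]:
            continue
        # then average AFR inside the cell's window
        if 900 < row["rpm"] < 1100 and 45 < row["kpa"] < 55:
            total += row["afr"]
            count += 1
    return total / count if count else None
```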
i will build my initial versions like that for sure
...but in the end my plan is to have a decaying radius of trim for each data point, kind of like a heat map, so each point will affect a spread of cells in that region instead of just a single cell selection. if tuned correctly this would allow a reasonable amount of interpolation in areas where data isn't present.
think about it in terms of hand-tuning without such analysis tools, which we have all done. i want to mimic that effect, so that minimal initial data coverage can produce far more reasonable corrections. if you're making a 30% correction to a particular cell, the adjacent cells are almost guaranteed to need at least half of that (15%), and even cells two away might still need a 5% adjustment. whereas if you're making a 5% adjustment to a cell, it might be localized.
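something like a gaussian splat would do it (python sketch; the falloff shape and width are my assumptions and would need tuning against real logs):

```python
import math

def splat_correction(table, weights, r, c, correction, sigma=1.0):
    """spread one data point's correction over nearby cells with a
    gaussian falloff instead of dumping it into a single cell.

    with sigma=1.0 a cell one step away gets ~61% of the full weight
    and two steps away ~14% -- roughly the 30% -> 15% -> 5% shape
    described above.
    """
    for i in range(len(table)):
        for j in range(len(table[0])):
            w = math.exp(-((i - r) ** 2 + (j - c) ** 2) / (2 * sigma ** 2))
            if w < 0.01:
                continue  # too far away to matter
            table[i][j] += w * correction
            weights[i][j] += w
```

after splatting every logged point, each cell's final trim would be table[i][j] / weights[i][j] wherever the weight is non-zero, which is what gives you the interpolation into cells that never saw data.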
this might involve mapping all of the data at a much higher resolution, then downscaling it to the target size, which is pretty easy. kind of like painting a picture of your fueling corrections, then pixelating it at whatever resolution your VE or MAF table happens to be.
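the downscale step is just block-averaging (python sketch, assuming the high-res grid is an integer multiple of the target table size):

```python
def downscale(grid, factor):
    """pixelate a high-res correction map down to the table size by
    averaging each factor x factor block into one output cell."""
    rows, cols = len(grid) // factor, len(grid[0]) // factor
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            block = [grid[i * factor + di][j * factor + dj]
                     for di in range(factor) for dj in range(factor)]
            out[i][j] = sum(block) / len(block)
    return out
```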