javascript - Filtering my data causes the y-axis to get screwed up


I am creating a column chart with d3. I have 263 data points, and showing all the columns makes the chart crowded. To filter the data points, I grab every nth item (starting from the reverse of the array, to ensure I keep the most recent data point).

I define the y-axis tick values to include the min and max of the unfiltered dataset, so the user can see the real min and max of the data. I calculate the min and max before I filter the data:

```javascript
var v = [];
data.forEach(function (d) {
    d.date = parseDate(d.date);
    d.close = +d.close;
    v.push(d.close); // v holds our values... before we filter them
});
yAxisValues = [Math.min.apply(null, v), Math.max.apply(null, v)];

if (data.length > 100) { // make sure the chart isn't crowded
    var len = data.length;
    var n = Math.round(len / 100); // ideally, we want no more than 100 items

    var tempData = [];
    data = data.reverse();

    for (var k = 0; k < len; k += n) {
        tempData.push(data[k]);
    }
    data = tempData;
    data = data.reverse();
}
```
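The same downsampling idea can be sketched standalone (plain JavaScript, no d3; `downsample` is a hypothetical helper name, not part of the original code). Reversing first guarantees the stride always keeps the most recent point:

```javascript
// Keep roughly `limit` points from `data`, always retaining the newest one.
// `downsample` is a hypothetical name for illustration only.
function downsample(data, limit) {
    if (data.length <= limit) return data.slice();
    var n = Math.round(data.length / limit); // stride between kept points
    var reversed = data.slice().reverse();   // newest first, so index 0 is always kept
    var kept = [];
    for (var k = 0; k < reversed.length; k += n) {
        kept.push(reversed[k]);
    }
    return kept.reverse(); // restore chronological order
}

// 263 points, like the question's dataset
var points = [];
for (var i = 1; i <= 263; i++) points.push(i);

var sampled = downsample(points, 100);
console.log(sampled.length);              // 88 (stride of 3 over 263 points)
console.log(sampled[sampled.length - 1]); // 263: the latest point survives
```

Note that `Math.round(263 / 100)` gives a stride of 3, so the result has 88 points rather than exactly 100; that matches the "ideally no more than 100" intent of the original loop.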

However, the y-axis gets screwed up, with -0.02 showing below the x-axis. What did I do wrong? Fiddle. (To see the y-axis behave normally, comment out the part that filters the data.)

You are creating the y-axis tick values before filtering, but you are still creating the scale's domain from the filtered data:

```javascript
var y = d3.scale.linear().range([height - 5, 5]);
// here we look at the min/max of the filtered data rather than the min/max of the original
y.domain(d3.extent(data, function (d) {
    return d.close;
}));
var yAxis = d3.svg.axis().scale(y).orient('left').tickValues(yAxisValues);
```

If you move that part to before the filtering, you should be OK: the domain and the tick values will then both be computed from the same unfiltered dataset.
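A minimal plain-JavaScript sketch of that reordering (hypothetical values, d3 omitted so it runs standalone): capture the full extent first, filter afterwards, and hand the saved extent to both `y.domain()` and `tickValues()`.

```javascript
// Hypothetical close values; in the question these come from the parsed data.
var closes = [5, 12, -0.02, 30, 18, 7];

// 1. Capture the unfiltered extent BEFORE any filtering.
var fullExtent = [Math.min.apply(null, closes), Math.max.apply(null, closes)];

// 2. Filter afterwards; the kept subset may drop the true min or max.
var filtered = closes.filter(function (d, i) { return i % 2 === 0; });

// 3. In the chart code you would now use the saved extent for the scale,
//    i.e. y.domain(fullExtent) instead of y.domain(d3.extent(filtered, ...)),
//    so the axis still spans the real range of the data.
console.log(fullExtent); // [-0.02, 30]
console.log(filtered);   // [5, -0.02, 18]
```

Here the filtered subset no longer contains 30, yet the axis domain built from `fullExtent` still covers it, which is exactly why the extent must be computed before the filter runs.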
