php - Improve the speed of cross-mapping an array of arrays -


I'm just looking at a little conversion from Perl to PHP. I utilized hashes to map values to keys across two arrays read in from two files. The files I'm using aren't big: 150,000 rows in one and 50,000 in the other. In Perl this runs in about 10 seconds; in PHP I've had to reduce the 150,000-row read-in file to around 20,000 rows and it still takes 3 minutes. I'm wondering if this is a limitation of the language or if my design is inherently flawed.

The two existing arrays of arrays, $ao_hash and $string_hash, are built as follows:

// load file contents
$file_contents = str_replace("\t", "|", file_get_contents($_FILES['file']['tmp_name']));
$file_array = explode("\n", $file_contents);

// pass the client dictionary into an array of arrays
foreach ($file_array as $line) {
    $line_array = explode("|", $line);
    if (stripos($line_array[0], 'mnemonic') !== false) {
        continue;
    }
    if (!isset($line_array[1])) {
        continue;
    }
    if (stripos($line_array[1], 'n') !== false) {
        continue;
    }
    if (!isset($line_array[10])) {
        continue;
    }
    $ao_hash[$line_array[10]] = $line;
}
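As an illustration only, here is a minimal, self-contained sketch (with made-up column values, not the real file) of which rows that filter keeps and how a kept row ends up keyed by column 10, the NDC:

// Hypothetical rows: a header, a row rejected because column 1 contains 'n',
// and a row that passes all checks and is keyed by column 10 (the NDC).
$file_array = [
    "mnemonic|flag|c2|c3|c4|c5|c6|c7|c8|c9|ndc",          // skipped: header row
    "AMOX250|n|c2|c3|c4|c5|c6|c7|c8|c9|00093416605",      // skipped: column 1 is 'n'
    "AMOX500|y|c2|c3|c4|c5|c6|c7|c8|c9|00093416705",      // kept
];

$ao_hash = [];
foreach ($file_array as $line) {
    $cols = explode("|", $line);
    if (stripos($cols[0], 'mnemonic') !== false) { continue; }
    if (!isset($cols[1]) || stripos($cols[1], 'n') !== false) { continue; }
    if (!isset($cols[10])) { continue; }
    $ao_hash[$cols[10]] = $line;
}

// $ao_hash now holds one entry:
// "00093416705" => "AMOX500|y|c2|c3|c4|c5|c6|c7|c8|c9|00093416705"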

Both hashes are built using this method, and both work (expected results, quick execution). Each one reads like this:

$array1[ndc] = some|delimited|file|output
$array2[ndc] = another|file|with|delimited|output

I'm using the NDC as the primary key to cross-map both arrays.
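For illustration, here is a minimal sketch (with hypothetical NDC keys and row strings, not real data) of how the first 11 characters of the client key are used to look up the cut-down file:

// Hypothetical entries keyed by NDC. The client key may carry extra characters,
// so only its first 11 characters are used for the lookup.
$string_hash = [
    "00093416705" => "another|file|with|delimited|output",
];
$ao_hash = [
    "00093416705XX" => "some|delimited|file|output",
    "99999999999XX" => "another|client|row|output",
];

foreach ($ao_hash as $key => $value) {
    $ndc = substr($key, 0, 11);
    if (isset($string_hash[$ndc])) {
        echo "match for $ndc\n";      // cross-map hit
    } else {
        echo "no match for $ndc\n";   // would fall back to the fsv look-up
    }
}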

// compare the client's drug report against the cut-down file
while (list($key, $value) = each($ao_hash)) {

    // use the ndc to match across the arrays of arrays
    if (isset($string_hash[substr($key, 0, 11)])) {
        $string_selector = $string_hash[substr($key, 0, 11)];
    }

    // check if the client ndc entry exists in the cut-down file
    if (!isset($string_selector)) {

        // no direct ndc match, reserve for the fsv look-up
        $ao_array = explode("|", $value);
        if (isset($ao_array[2]) && isset($ao_array[16])) {
            $no_matches[$ao_array[2].'|'.$ao_array[16]]['ndc'] = $ao_array[10];
            $no_matches[$ao_array[2].'|'.$ao_array[16]]['mnemonic'] = $ao_array[0];
        }
    } else {

        // direct match found
        $ao_array = explode("|", $value);
        $cutdown_array = explode("|", $value);
        foreach ($cutdown_array as $cutdown_col) {
            if ($cutdown_col == "") {
                $cutdown_col = "0";
            }
            $cutdown_verified[] = $cutdown_col;
        }

        // drop the last column
        array_pop($cutdown_verified);

        // merge into a single string
        $final_string = implode("|", $cutdown_verified);

        // prepare data for the fsv match
        if (isset($ao_array[2]) && isset($ao_array[16])) {
            $yes_matches[$ao_array[2].'|'.$ao_array[16]]['drug_string'] = $final_string;
        }

        // add the mnemonic to the end
        $final_string .= '|'.$ao_array[0];
        $drug_map[$ao_array[0]] = $final_string;
    }
}

Any help would be awesome; I just want this to run faster.

Redditor https://www.reddit.com/user/the_alias_of_andrea solved the issue:

Instead of using:

while (list($key, $value) = each($ao_hash)) 

it is more efficient to use:

foreach ($ao_hash as $key => $value) 

Now the 13 MB file executes quickly, with the expected results.
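For context, here is a rough timing sketch of the difference between the two iteration styles. The data and sizes are hypothetical, and it is only runnable on PHP 5.x–7.x, since each() was deprecated in PHP 7.2 and removed in PHP 8:

// Build a hypothetical array roughly the size of the question's larger file.
$data = [];
for ($i = 0; $i < 150000; $i++) {
    $data["key$i"] = "some|delimited|row|$i";
}

// Iterate with each(): one function call per element plus internal-pointer bookkeeping.
$start = microtime(true);
reset($data);
while (list($key, $value) = each($data)) {
    // empty body: only the iteration overhead is measured
}
printf("each():  %.4f s\n", microtime(true) - $start);

// Iterate with foreach: handled directly by the engine, no per-element function call.
$start = microtime(true);
foreach ($data as $key => $value) {
    // empty body
}
printf("foreach: %.4f s\n", microtime(true) - $start);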

