Interesting task for programmers

T7
Site user since 19.09.2018
Offline
31
#31
Danforth:
By the way, in PHP you could wrap the decoder in a generator via yield, it might be faster

Maybe, but a significant win from a generator is dubious. It already uses next to nothing: 0.36 / 2 MB (memory_get_usage / memory_get_peak_usage).
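For context, a minimal sketch of what the suggested yield wrapper could look like, assuming (my assumption, not something stated in the thread) that the records are stored one JSON object per line:

// Sketch only: assumes one JSON object per line (NDJSON-style storage),
// which is not necessarily how the thread's data file is laid out.
function decodedRecords(string $file): Generator
{
    $fh = fopen($file, 'r');
    if ($fh === false) {
        throw new RuntimeException("Cannot open $file");
    }
    while (($line = fgets($fh)) !== false) {
        $line = trim($line);
        if ($line === '') {
            continue;
        }
        // Decode lazily, one record at a time, instead of slurping the whole file.
        yield json_decode($line, true);
    }
    fclose($fh);
}

// Usage: memory stays flat because only one decoded record lives at a time.
foreach (decodedRecords('/var/web/aio/data/json') as $rec) {
    // filter / score the record here
}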

Where you can actually win is on the operations.

Knocked together in 10 minutes, 35 lines of code:


test
json_decode x 10000: 16.0880 ms

json_encode x 3000: 4.3969 ms

file_append [(json x 10 x 300)]: 5.7662 ms

file_append [(json x 3000)]: 55.0900 ms

Also, judging by the write tests (the 50-60 ms one), the only real potential is in how the writes are done: not 3,000 calls to file_put_contents($file, $dd, FILE_APPEND),

but, say, 300 calls with 10 matches at a time. In the first case there is no per-record writing; in the second, which is trickier, the json goes out in packs of 500 matches.
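Roughly, the batching could look like this (a sketch under my own assumptions: $matches is a list of already json_encoded records, and the output path is made up):

// Sketch: write the output in batches instead of one file_put_contents per record.
// $matches and the output path are placeholders, not names from the thread.
$matches = [];                            // already json_encoded records
$file    = '/var/web/aio/data/json-out';  // hypothetical output file
$batch   = 500;                           // records per write, as suggested above

file_put_contents($file, '[');
$first = true;
foreach (array_chunk($matches, $batch) as $chunk) {
    // One append per 500 records instead of one per record.
    file_put_contents($file, ($first ? '' : ',') . implode(',', $chunk), FILE_APPEND);
    $first = false;
}
file_put_contents($file, ']', FILE_APPEND);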

I still support the idea that the json has to be decoded, even if that means parsing it 10,000 times. It gives flexibility and versatility.

$d = '
{
    "id": 47704,
    "scores": 0.7003193510956659
}';
$t0 = microtime(true);
for ($i = 0; $i < 10000; $i++) {
    $j = json_decode($d);
}
echo sprintf("json_decode x 10000: %01.4f ms",
    (microtime(true) - $t0) * 1000), "\n\n";

print_r($j);
$t0 = microtime(true);
for ($i = 0; $i < 3000; $i++) {
    $s = json_encode($j);
}
echo sprintf("json_encode x 3000: %01.4f ms",
    (microtime(true) - $t0) * 1000), "\n\n";

$dd = str_repeat($d, 10);
$file = '/var/web/aio/data/json-22';
$t0 = microtime(true);
file_put_contents($file, '[');
for ($i = 0; $i < 300; $i++) {
    file_put_contents($file, $dd, FILE_APPEND);
}
file_put_contents($file, ']', FILE_APPEND);
echo sprintf("file_append [(json x 10 x 300)]: %01.4f ms",
    (microtime(true) - $t0) * 1000), "\n\n";

$t0 = microtime(true);
file_put_contents($file, '[');
for ($i = 0; $i < 3000; $i++) {
    file_put_contents($file, $d, FILE_APPEND);
}
file_put_contents($file, ']', FILE_APPEND);
echo sprintf("file_append [(json x 3000)]: %01.4f ms",
    (microtime(true) - $t0) * 1000), "\n\n";

exit();
HM
Site user since 14.01.2012
Offline
208
#32
Danforth:
Why don't you agree?

Because you need to take not a single 50-character line, but something bigger;

the rest I deleted, because I'm not a programmer and I'm too lazy to argue.

---------- Posted 15.06.2020 at 21:51 ----------

Danforth:
It's not the hundredth,

It's not the hundredth, it's a conditional SO-style hundredth. A task pulled out of thin air, writing a new solution when it has all long since been written and there are libs for it.

To be perfectly blunt, sly32 got triggered by the level of this forum section (which is logical, given the two dunces sevlad and sitealert) and decided to hold a code-measuring contest. But I think (and you know it yourself) that this is not a forum of that level.

T7
Site user since 19.09.2018
Offline
31
#33
hakuna matata:
A task pulled out of thin air, writing a new solution when it has all long since been written and there are libs for it.

The topic should be treated as a warm-up, that's what makes it interesting. Plus an exchange of opinions, for those who aren't too lazy.

hakuna matata:
I'm not a programmer

And so, now watch:

hakuna matata:
There are plenty of them on Google: https://github.com/salsify/jsonstreamingparser

 "Require": { 
"Php": "^ 7.2",
"Twig / twig": "^ 2.0",
"Box / spout": "^ 3.0",
"Endroid / qr-code": "*",
"salsify / json-streaming-parser ": "*"
}

We plug it in:

header('content-type: text/plain');
$testfile = '/var/web/aio/data/json';
$listener = new \JsonStreamingParser\Listener\InMemoryListener();

logger('Before JsonStreamingParser\\Parser');
print_r(end($GLOBALS['aapp_timing']));
$stream = fopen($testfile, 'r');
try {
    $parser = new \JsonStreamingParser\Parser($stream, $listener);
    $parser->parse();
    fclose($stream);
} catch (Exception $e) {
    fclose($stream);
    throw $e;
}
logger('After JsonStreamingParser\\Parser');
print_r(end($GLOBALS['aapp_timing']));

var_dump($listener->getJson());

It doesn't actually do anything with the result ($listener->getJson() comes after the logger call), it's just wired up, yet it added 243.5660 ms and 4+ MB of memory.

Array
(
    [msg] => Before JsonStreamingParser\Parser
    [afterstart] => 3.1090 (ms)
    [afterprevios] => 0.0880 (ms)
    [mem_peak] => 0.51 / 2.00 (mb)
)
Array
(
    [msg] => After JsonStreamingParser\Parser
    [afterstart] => 246.6750 (ms)
    [afterprevios] => 243.5660 (ms)
    [mem_peak] => 5.21 / 4.00 (mb)
)
array(10000) {
  [0] =>
  array(2) {
    ["id"] =>
    int(21675)
    ["scores"] =>
    float(0.27687288427211)

The code of the logger function:

function logger($message)
{
    if (!_DEBUG) return 0;
    if (!array_key_exists('aapp_timing', $GLOBALS)) {
        $GLOBALS['aapp_timing'] = [];
        $last = $GLOBALS['aapp_last_timing'] = _STIME;
    } else $last = $GLOBALS['aapp_last_timing'];
    $t1 = microtime(true);
    $m = (memory_get_usage() / 1024) / 1024;
    $mp = (memory_get_peak_usage(true) / 1024) / 1024;
    $m = sprintf("%01.2f / %01.2f (mb)", $m, $mp);
    $GLOBALS['aapp_timing'][] = ['msg' => $message,
        'afterstart' => __timer($t1, _STIME),
        'afterprevios' => __timer($t1, $last),
        'mem_peak' => $m
    ];
    $GLOBALS['aapp_last_timing'] = $t1;
}
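The __timer() helper and the _DEBUG / _STIME constants aren't shown in the post; judging by the "(ms)" output format above, they presumably look something like this (my reconstruction, not the original code):

// Hypothetical reconstruction of the missing pieces, matching the "(ms)" output above.
define('_DEBUG', true);
define('_STIME', microtime(true));  // set once at the very start of the request

function __timer(float $now, float $since): string
{
    return sprintf('%01.4f (ms)', ($now - $since) * 1000);
}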
Attachment: hakuna.png
T7
Site user since 19.09.2018
Offline
31
#34
hakuna matata:
this is not a forum of that level.

So the level needs to be raised. Which is exactly why the topic fits.

D
Site user since 18.12.2015
Offline
142
#35
hakuna matata:
Because you need to take not a single 50-character line, but something bigger;

It makes no difference whether the line is large or small. And yes, the test really is synthetic, but it's just an example of how a language that is fast at some tasks is not always fast at others.

hakuna matata:
It's not the hundredth, it's a conditional SO-style hundredth. A task pulled out of thin air, writing a new solution when it has all long since been written and there are libs for it.

There are libs for plenty of things; it would be silly to create a topic where everyone just pulls in a lib and calls a single function. The point of the topic is to level up skills and chat about code and development (which, in essence, is what we are doing).

Development and support of high-load projects.
S
Site user since 30.09.2016
Offline
459
#36
hakuna matata:
given the two dunces sevlad and sitealert

See a specialist about your condition. They'll help you. And get plenty of rest, we're all worried about you.

hakuna matata:
this is not a forum of that level.
Well, your level is well known. Can you even make sense of this piece of code? :D
 echo 'Tupaya Kozlina'; 
I'll saw off the excess, bolt on what's needed, straighten out the crooked. And exterminate the pests.
HM
Site user since 14.01.2012
Offline
208
#37
Danforth:
a language that is fast at some tasks is not always fast at others.

Would it be much trouble for you to plug re2 into your Go code right now and run the check again? I'm just looking at it and don't see much of a speed difference with PCRE :)

D
Site user since 18.12.2015
Offline
142
#38
hakuna matata:
Would it be much trouble for you to plug re2 into your Go code right now and run the check again?

It is tricky: actually hooking up re2 is not that simple, you have to compile it as a library and then link it into the binary, because there is no re2 port for Go. When I have some free time I'll dig into it.

hakuna matata:
I'm just looking at it and don't see much of a speed difference with PCRE


Go without compilation: 0.005807088 seconds
Go with compilation: 0.000459663 seconds (x12 speedup)
PHP: 0.000177860 seconds (x2.5 faster than compiled Go, x32 faster than uncompiled)

So what? It's clear that most applications do something besides just running regexes. In fact, there are still a few places where PHP is faster than Go: where the task doesn't split well across threads, and where the called function is implemented in a C module. For example, various TLS implementations can be faster in PHP, only on their own there is nowhere to apply them.
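For reference, the PHP side of such a regex timing can be reproduced with a loop like the one below (the pattern and subject are placeholders, not the ones from the thread's task). One reason PHP holds up well here is that preg_* caches compiled patterns, so the regex is compiled once rather than on every iteration:

// Placeholder pattern and subject; only the timing structure matters.
$pattern = '/^[a-z0-9._%+-]+@[a-z0-9.-]+\.[a-z]{2,}$/i';
$subject = 'someone@example.com';

$t0 = microtime(true);
for ($i = 0; $i < 100000; $i++) {
    preg_match($pattern, $subject);   // PCRE reuses the cached compiled pattern
}
printf("preg_match x 100000: %01.4f ms\n", (microtime(true) - $t0) * 1000);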

S3
Site user since 29.03.2012
Offline
212
#39

It doesn't work like that; the bug is caused by the asynchronous code.

HM
Site user since 14.01.2012
Offline
208
#40
Danforth:
because there is no re2 port for Go. When I have some free time I'll dig into it.

No, my mistake, I thought it was already there.

Actually I'm surprised about Go, "I thought it would be way better than everything else" (c).

And while we're at it, advise me what to write a little spam-sender in. The task is just to send one POST request and parse the response. Right now I send via php-curl, and at 5k threads it already eats CPU and memory (in my setup every request is actually a separate php-cli process, I know it's funny). I wrote above that I'm not a programmer; I figure I should read up on something, but I don't know what :) Node or the other one?

I could, of course, hack something together in PHP with Guzzle, but I'd be curious to hear your advice.
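For what it's worth, a sketch of how that could look with Guzzle's request pool instead of one php-cli process per request; the endpoint, payload, and concurrency value are made-up placeholders:

// Sketch: many concurrent POSTs from a single process via GuzzleHttp\Pool.
// URL, payload and concurrency are placeholders, not values from the thread.
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client(['timeout' => 10]);

$requests = function (int $total) {
    for ($i = 0; $i < $total; $i++) {
        yield new Request(
            'POST',
            'https://example.com/api',              // placeholder endpoint
            ['Content-Type' => 'application/json'],
            json_encode(['id' => $i])               // placeholder payload
        );
    }
};

$pool = new Pool($client, $requests(1000), [
    'concurrency' => 50,   // requests in flight at once, tune as needed
    'fulfilled' => function ($response, $index) {
        // parse the response here
        $data = json_decode((string) $response->getBody(), true);
    },
    'rejected' => function ($reason, $index) {
        // log the failure here
    },
]);

$pool->promise()->wait();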
