mirror of
https://github.com/SpacehuhnTech/esp8266_deauther.git
synced 2026-04-26 16:25:54 +03:00
[GH-ISSUE #121] Full APScan Async List #83
Originally created by @tobozo on GitHub (Mar 13, 2017).
Original GitHub issue: https://github.com/SpacehuhnTech/esp8266_deauther/issues/121
Originally assigned to: @tobozo, @spacehuhn on GitHub.
The ability to get the full AP scan list has not been implemented yet; here are the milestones:
@spacehuhn commented on GitHub (Mar 14, 2017):
I think I might have found a solution to the problem. We can still send it as one file: instead of cutting it at the max size and then asking via JS for every remaining piece, we can send the full string in split parts with the web server. The browser will recognize it as one file.
I'll show you guys what I mean later. I just got this working on another project where it's necessary to view large files that can be bigger than the RAM.
@spacehuhn commented on GitHub (Mar 14, 2017):
Ok, I guess this isn't easy to implement right away due to the current program structure. However, here's what I did in my other project:
A few preparations:
In esp8266\hardware\esp8266\2.0.0\libraries\ESP8266WebServer\src\ESP8266WebServer.h, cut the void _prepareHeader function declaration from under protected: and paste it under public:.

To the actual code:
I used a buffer array like we have in data.h:
char data_websiteBuffer[6000];

And here is an example function which generates a random 11,500-byte text file and sends it to the user:
and then of course:
server.on("/test", test);

@tobozo commented on GitHub (Mar 14, 2017):
How about using this?
[edit] bonus with websockets!
@spacehuhn commented on GitHub (Mar 14, 2017):
Ok I guess that would be way easier and better... You can also load websites directly out of the progmem!
@tobozo commented on GitHub (Mar 14, 2017):
Oh, this means a complete rewrite of the html/js, am I right?
Currently the project is not very friendly to frontend dev iterations. I had to use SerialServer + persistence to get the opportunity to modify the files without reflashing to see my changes, and it's so f*****g slow I'll probably close the Full Serial Peering feature request :-)
But the good thing is a nodejs app would eventually bring the ability to modify html/js/css without reflashing the ESP on every change
It would also enable the possibility to regenerate data.h (or recompile & flash the ESP but that's not the point).
How do you handle your frontend iterations? Do you also minify/convert/flash manually on every change?
@spacehuhn commented on GitHub (Mar 14, 2017):
Yes, I do this every time. But unless it's something with JavaScript, I test it in the browser with its developer tools beforehand.
I don't think we should rewrite everything now, that would be soooooo much work!
The only thing I really wanna see fixed is the APScan list, just some workaround to the current problem - it doesn't need to be pretty, just functional.
One way is the async framework you linked, another would be using my webserver 'hack'. I don't know what's easier though.
I don't have much time to code at the moment and I wanna move on to other projects.
@spacehuhn commented on GitHub (Mar 16, 2017):
I can't get the async library working with the 2.0.0 SDK 😞
@tobozo commented on GitHub (Mar 16, 2017):
The buffering hack solves the problem for big files coming from data.h, but doesn't solve the problem in getAPScanResults(), where the huge JSON is built and crashes.
Back to pagination then?
@spacehuhn commented on GitHub (Mar 16, 2017):
I think the hack can solve that problem too if it writes into the same website buffer instead of a temporary string. But I haven't had time to try it out yet.
@spacehuhn commented on GitHub (Mar 16, 2017):
I found a way to use my 'hack' without a change in the library files.
My little tutorial:
https://gist.github.com/spacehuhn/6c89594ad0edbdb0aad60541b72b2388
@spacehuhn commented on GitHub (Mar 16, 2017):
Fixed :D
a71946d09a

It actually was because of the string!
I added 3 new functions to data.h:
sendHeader must be called first; then copy every string (or substring, when generating JSON files) with sendToBuffer, and when everything is done, call sendBuffer.

@tobozo commented on GitHub (Mar 19, 2017):
Nice job! I wish I could test it, but I just saw my last Wemos die in a tiny mushroom cloud of toxic smoke 🔥 🍄 ☁️. The deauther is still running well, but it's an older version and I can't update it anymore (the CH340 is dead, I guess).
Looking at this thread: when using Transfer-Encoding: chunked, you don't have to calculate the size before sending (TL;DR: just send a zero-length final chunk when it's finished). I'm not sure this applies to SDK 2.0.0, but it could save you from looping twice over the results.

@tobozo commented on GitHub (Mar 31, 2017):
Closing this thread, as a very long APScan list is no longer a stability problem; tested on Wemos D1 Mini and NodeMCU DevKit V3 without a glitch \o/
@spacehuhn commented on GitHub (Mar 31, 2017):
Great! I'm very happy that it works now!
I saved the linked thread; however, I'm busy with other things right now, so I couldn't test it yet.