Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/codedcontainer/useful-javascript-library
Useful set of tools to automate the creation of things using JavaScript objects
Last synced: 5 days ago
- Host: GitHub
- URL: https://github.com/codedcontainer/useful-javascript-library
- Owner: codedcontainer
- Created: 2015-05-29T14:12:05.000Z (over 9 years ago)
- Default Branch: master
- Last Pushed: 2016-02-17T16:15:26.000Z (over 8 years ago)
- Last Synced: 2024-10-10T13:43:40.799Z (about 1 month ago)
- Language: JavaScript
- Homepage:
- Size: 52.7 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 2
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# Common JavaScript Objects for Common Tasks
A useful set of tools that automate common tasks using JavaScript objects.
I have been trying to keep everything written in plain JavaScript to allow flexibility between web applications. There are a few references to the jQuery library, so it is important that it is installed as well.

1. A Href String Generator
2. BreadCrumb Creator
3. Simple HTML AJAX
4. Radio Button Value To DOM
5. Radio Button On Change (see the sketch after this list)
6. Grab and Serialize Form Data
7. Ajax Send w/ Modal Popup
8. Reorder A list with links
9. Dropdown Sub Menu Height
10. Equal Height of child divs
11. IE Image Replace
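For example, tools #4 and #5 cover radio buttons. The snippet below is only a generic sketch of that pattern in jQuery; the selectors and the target element are illustrative assumptions, not the library's actual API.

```javascript
// Generic illustration of tools #4 and #5: react to a radio button change (#5)
// and write the selected value into the DOM (#4).
// The "shipping" name and "#shipping-choice" target are placeholders.
$('input[name="shipping"]').on('change', function () {
  var selected = $('input[name="shipping"]:checked').val();
  $('#shipping-choice').text(selected); // write the chosen value into the page
});
```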
## Additional Tools
### ASP Send Mail
Inside of the "ASP Email Submit" directory there is a file that will send any of your forms to the selected email address using ASP. This can also be used with #6 and #7 of the JavaScript object tools by sending the serialized form data to this file.
Steps for Execution:
1. Change the name values inside of this file
2. Add any additional CC's
3. Return false from the form's submit handler so the form does not post normally
4. Send the serialized form data to the file using AJAX (#6 & #7), as shown in the sketch below
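A rough illustration of steps 3 and 4 using jQuery: the form id and the path to the ASP handler are placeholders, since the actual file name inside "ASP Email Submit" is not listed here.

```javascript
// Hedged sketch of steps 3-4: serialize the form (#6) and send it via AJAX (#7).
// "#contact-form" and the handler path are placeholders, not the repository's actual names.
$('#contact-form').on('submit', function (event) {
  event.preventDefault();                 // step 3: stop the normal form submit
  var formData = $(this).serialize();     // step 4 / tool #6: serialize the form data

  $.ajax({                                // tool #7: send the data via AJAX
    type: 'POST',
    url: 'ASP Email Submit/sendmail.asp', // placeholder path to the ASP mail script
    data: formData
  }).done(function () {
    alert('Form submitted');
  });
});
```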
### Node Web Scraper
The tools also include an app.js file which uses Node.js and several modules to scrape a website. The scraper adds a start and end tag on each of the pages so that each match can be found and replaced with the IDE of your choosing. The tool also scrapes files of various types, such as .js and .css, when it comes in contact with them. In addition, the scraper grabs images and saves them to a separate directory. If a div container is selected, these "extra" files will remain empty; if you need support for them, you will need to add conditions based on file type (a sketch follows the steps below).
Steps for Execution:
1. Download all dependencies for the application using npm
2. Add a pages and images folder
3. Update the variables in app.js for the website of your choosing and the main div for scraping
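A minimal sketch of the kind of scraping app.js performs. It assumes Node's built-in https and fs modules plus cheerio for parsing; the URL, selector, and output path are illustrative placeholders, not the repository's actual configuration.

```javascript
// Sketch of a page scraper: fetch a page, extract the main div, and save it
// wrapped in start/end markers for later find-and-replace in an IDE.
const https = require('https');
const fs = require('fs');
const path = require('path');
const cheerio = require('cheerio');

const targetUrl = 'https://example.com/'; // site to scrape (placeholder)
const mainSelector = '#main';             // main div to extract (placeholder)
const pagesDir = path.join(__dirname, 'pages'); // folder from step 2

https.get(targetUrl, (res) => {
  let html = '';
  res.on('data', (chunk) => { html += chunk; });
  res.on('end', () => {
    const $ = cheerio.load(html);
    const extracted = $(mainSelector).html() || '';
    const content = '<!-- START SCRAPE -->\n' + extracted + '\n<!-- END SCRAPE -->';
    fs.writeFileSync(path.join(pagesDir, 'index.html'), content);
  });
}).on('error', (err) => console.error(err));
```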
### Angular.js XML RSS Feed
Now you can grab an RSS feed and parse its data using promises. This only goes through one RSS item; you will need to edit the code and add an "angular.forEach" loop for each XML parent node (a sketch follows the steps below).
Steps for Execution:
1. Make sure to include the x2js script file
2. Copy and paste the following code into your app file
3. Rename the variables to reflect your model variable name
4. Add your XML script url to the $http request method
5. Update the properties from the XML to reflect the values you need to return
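A minimal sketch of steps 2 through 5, assuming the x2js library (X2JS) is loaded alongside AngularJS. The module name, controller name, feed URL, and the rss.channel.item property path are assumptions that depend on your app and the feed's structure.

```javascript
// Sketch: fetch an XML RSS feed with $http, convert it with x2js,
// and walk every <item> with angular.forEach instead of only the first one.
angular.module('rssApp', [])
  .controller('FeedController', ['$scope', '$http', function ($scope, $http) {
    var x2js = new X2JS();
    $scope.items = [];

    $http.get('https://example.com/feed.xml').then(function (response) {
      // Convert the raw XML string into a plain JavaScript object.
      var data = x2js.xml_str2json(response.data);
      // The rss.channel.item path is typical for RSS feeds; adjust to your XML.
      angular.forEach(data.rss.channel.item, function (item) {
        $scope.items.push({ title: item.title, link: item.link });
      });
    });
  }]);
```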