Get to know me!
Employer Type: Orange County Public Schools
Your title: Certified Teacher
A thorough description of responsibilities:
Logging into a database of set curriculum requirements; comparing the requirements for each day, week, and unit against various curriculum resources; analyzing available resources and materials for elements that help all learning types understand the material; and then selecting material on that basis.
Implementing a Pareto-based behavioral management plan to provide warnings to students acting outside the rules and guidelines for an optimal classroom environment. Generating conduct data based on a three-step warning system: yellow, red, you're out or referred. Reiterating rules and expectations on a daily basis.
Using data collection materials to measure student progress against requirements and design learning processes that allow students at all levels to move toward successful completion of their grade-level year. Using data on test scores and assignment scores, based on the number of questions answered and the time taken to reach an acceptable score, to understand student needs. Using those needs assessments to add skills to the learning material and advance the goal of successful completion of requirements.
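As a rough illustration of this kind of progress check, here is a minimal pandas sketch; the column names and thresholds are hypothetical stand-ins, not the district's actual benchmarks:

```python
import pandas as pd

# Hypothetical gradebook export; column names and thresholds are illustrative.
scores = pd.DataFrame({
    "student": ["A", "B", "C"],
    "pct_correct": [0.92, 0.68, 0.80],      # share of questions answered correctly
    "minutes_to_mastery": [14, 35, 22],     # time taken to reach an acceptable score
})

# Flag students below the benchmark or well over the expected time, so extra
# skills practice can be added to their learning material.
needs_support = scores[(scores["pct_correct"] < 0.70) |
                       (scores["minutes_to_mastery"] > 30)]
print(needs_support)
```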
Maintaining constant contact with parents and administrators about student progress and conduct.
Dates of employment: November 1, 2023 - present
Employer Type: Education Staffing
Your title: Substitute Teacher
A thorough description of responsibilities:
Logging into a database of teacher absences and building a weekly full-time schedule that combines full- and half-day assignments at various schools. Mapping out transportation requirements for each assignment under consideration to schedule a viable route to and from multiple locations and arrive on time. Administering predesigned lesson plans as classroom presentations: providing lectures and using visuals while demonstrating each step of the assigned work or lesson plan. Engaging students to answer questions, take on roles related to the presented materials, or both. Handing out work for students to complete, then closely monitoring student activity to ensure the work is completed while answering any questions so that students are fully able to finish the required work. Closely adhering to any accommodation plans a student may have. Where a lesson plan does not exist, researching the current topic the students are studying and finding materials or content to improvise my own lesson plan. Traveling to various locations within the county to complete assignments I have registered for. Updating a billing database with time on assignments, time on breaks, and completion of assignments. Turning in a timesheet each week to receive payment.
Dates of employment: March 13, 2020 - present
Employer Type: Law Firm
Your title: SEO Digital Optimization Specialist, later Analytics Administrator
A thorough description of responsibilities:
At first, the job was an evaluation of the entire digital media infrastructure to determine areas of weakness and areas needing improvement. The first step was to check the domain authority (DA) of the existing website, cross-compare it with other leading law firm websites, and then study why the domain stood where it did. This included scraping data off the various social media sites and watching how the stats rose and fell, and digging into segmentations of content types to find out what customers preferred. This employer wanted to be sure he was adhering to a specific cultural association the business belonged to and attending to members of that population who might be reading his content on social media sites, so I was also segmenting the data for patterns of interest and engagement among media participants who specifically associated themselves with that culture, separate from the general population, and finding commonalities between the two populations.
Note that for all the descriptions below, data were transformed into dashboards using pivot tables, with worksheets linking each pivot table to its chart. These visuals made it easy to spot the overall patterns in the data at a glance. The pivot tables were visualized as bar charts, line charts, and pie graphs, and I also created histograms to find outliers.
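As an illustration of that pivot-and-chart workflow, a minimal sketch in pandas and matplotlib might look like the following; the original work was done in Excel, and the content types and figures here are made up:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical content log; the actual work used Excel pivot tables.
posts = pd.DataFrame({
    "content_type": ["video", "article", "video", "infographic", "article"],
    "impressions": [1200, 450, 980, 700, 520],
})

# Pivot: total impressions per content type, then a quick bar chart,
# mirroring a pivot table linked to a chart on a dashboard worksheet.
pivot = posts.pivot_table(index="content_type", values="impressions", aggfunc="sum")
pivot.plot(kind="bar", legend=False, title="Impressions by content type")
plt.tight_layout()
plt.show()
```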
I coded the content for different recurring patterns and then charted metrics such as likes, engagements, and click-throughs against the codes: the database had a column for the codes, then separate columns for the likes, engagements, and click-through data. A correlation calculation determined where impressions correlated with the coded data, since some content carried multiple codes. Codes were also generated for sentiment, and sentiment was correlated with the content codes to determine whether the impressions were positive or negative. There was also an outlier search: where outliers appeared, I went to the source generating them to see what the feedback was, so I could determine whether the data was repairable or an indicator of a possible new trend to act on.
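A hedged sketch of that correlation step, assuming a small coded-content table with dummy 0/1 code columns; the code names and numbers are illustrative:

```python
import pandas as pd

# Hypothetical coded-content table: one row per post, dummy (0/1) columns for
# each content code, a sentiment code, and the impression count.
df = pd.DataFrame({
    "code_howto":  [1, 0, 1, 1, 0, 0],
    "code_news":   [0, 1, 0, 1, 1, 0],
    "sentiment":   [1, -1, 1, 1, -1, 1],    # 1 = positive, -1 = negative
    "impressions": [900, 300, 1100, 1500, 250, 400],
})

# Correlate each code (and sentiment) with impressions; posts can carry
# multiple codes, which is why each code gets its own column.
print(df.corr()["impressions"].drop("impressions"))
```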
I evaluated websites using heat maps to determine what customers interacted with most and how long they stayed when they interacted, and I tested the funnel to see where they moved based on the heat map data. For this, I charted the location of each hot spot as a code in the coded column of the database, the time of the interaction in another column, whether the visitor went to another page (1 for yes, 2 for no) in a third, and then coded the funnel with a similar number reference for where the visitor entered the site. The funnels were tested for correlations. The positive correlations were isolated and compared with the sentiment and impression charts to find correlations between those and the customer actually visiting the site, staying a while, and exploring separate pages. A correlation was also created for the hot spots to determine whether content encouraging website visits drove attention to a specific spot on a page, and possibly why or how.
To increase the reach of each piece of new content on their platforms, it was important to find keywords to include that would draw an audience to view and read it. Such keywords can be introduced as hashtags and as subtle wording within the general content. There are keyword raters on most SEO and social media tools, such as SEMrush, Ahrefs, and Moz. They typically gather data from search engines to find what consumers are searching for, how often a search produces a hit, and what the ads served against that keyword are worth. The engines then ascribe a cost-per-click (CPC) statistic: keywords that are commonly searched but not heavily used carry a larger cost than keywords used so heavily that a consumer must wade through many ads to find one particular ad. While it is essential to use higher-CPC words to beat competitors, the return on ad spend (ROAS) can sometimes be increased by combining lower-CPC keywords with very small amounts of high-CPC words. These CPC and keyword references were placed into an Excel sheet with estimated traffic, market CPC rates, and estimated ad strength, and the return on ad spend was forecasted on the sheet for each keyword.
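A minimal sketch of the ROAS forecast arithmetic, with made-up keywords, conversion rates, and case values standing in for the real sheet's figures:

```python
import pandas as pd

# Hypothetical keyword sheet: estimated traffic and market CPC rates; the
# conversion rates and case values are invented for the example.
kw = pd.DataFrame({
    "keyword": ["car accident lawyer", "injury claim help", "free case review"],
    "est_clicks": [500, 1200, 900],
    "cpc": [18.0, 4.5, 2.2],                # market cost per click, USD
    "conv_rate": [0.04, 0.03, 0.02],        # assumed conversion rates
    "value_per_conv": [3000, 2500, 2000],   # assumed value of a signed case
})

kw["ad_spend"] = kw["est_clicks"] * kw["cpc"]
kw["revenue"] = kw["est_clicks"] * kw["conv_rate"] * kw["value_per_conv"]
kw["roas"] = kw["revenue"] / kw["ad_spend"]   # return on ad spend per keyword
print(kw[["keyword", "ad_spend", "revenue", "roas"]])
```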
The strength of a keyword can sometimes be determined without using CPC. There are hashtag checkers that calculate how often a particular hashtag is used and how many impressions it generates. These figures were entered into an Excel sheet and compared via a pivot table to find hashtags with low use but a high impression rate. The table was segmented by usage into the low, second, third, and high quartiles, with bars in each quartile showing the number of impressions for each hashtag. This was compared against the keyword CPC rates and their accompanying return on ad spend to calculate a forecasted ad strength. Where a hashtag calculation referenced a keyword with low use but high impressions, at a lower CPC rate than a heavily used hashtag, that keyword became a high-rank word for the current campaign: it could serve as both a keyword and a hashtag, and possibly a new brand term for the company.
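A small sketch of the quartile segmentation, using pandas' qcut in place of the Excel pivot table; the hashtags and counts are invented:

```python
import pandas as pd

# Hypothetical hashtag stats: how often each tag is used vs. impressions earned.
tags = pd.DataFrame({
    "hashtag": ["#injurylaw", "#caraccident", "#knowyourrights", "#legalhelp",
                "#lawfirm", "#compensation", "#accidentclaim", "#justice"],
    "uses": [120, 15, 40, 800, 2500, 60, 9, 1500],
    "impressions": [9000, 12000, 3000, 20000, 30000, 6500, 5000, 18000],
})

# Segment usage into quartiles, then look for low-use, high-impression tags:
# those are candidate high-rank keywords (and possible new brand terms).
tags["use_quartile"] = pd.qcut(tags["uses"], 4,
                               labels=["low", "second", "third", "high"])
candidates = tags[(tags["use_quartile"] == "low") &
                  (tags["impressions"] > tags["impressions"].median())]
print(candidates)
```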
The hashtags that looked likely to become good keywords were also tested on the social media site to see what came up for each word. The derived content was coded to identify any trends and to judge whether the hashtag would be disruptive for the law firm or blend into what was already there. The optimal outcome was for the hashtag to be disruptive, showing no correlation in a cross-comparison on an Excel sheet. The most disruptive high-CPC keyword was also included as long as the return on ad spend and ad strength cohered with its disruptiveness; they typically did, but sometimes the given budget required other uses that decreased the ROAS due to the CPC. Other times, using the keyword in other marketing media increased the ROAS of the social media campaign by adding estimated traffic from the possible use of that word to find the firm.
Content posting required small summaries, hashtags, website links, or a combination of all three. To increase the value of each media post, it was important to run a search on the backlinks to the website and any forward links within it. The idea was to create a network where the forward links lead people back to the website, or lead them to a single resource they want that in turn leads them back to the website for more resources of the same kind. For precision, the link analysis was downloaded daily into an Excel sheet, monitored for changes, and watched for correlated movement in the DA.
Once the content was posted, it was time to go onto the platforms and check the impressions, engagements, click-throughs, heat interactions, etc. for the new content plan. Many of these stats are available right on the social site and can be exported as a downloadable Excel or CSV file, which can be analyzed in Excel or uploaded into almost any analysis platform. The files were then loaded and programmed with calculations for mean, median, mode, correlations, ANOVA, MANOVA, and other important statistics, then turned into visuals using pivot tables, with the tables linked into a single dashboard carrying labels and explanations of what the data meant.
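A minimal sketch of the descriptive statistics and a one-way ANOVA of that kind, assuming a hypothetical per-post export; the real calculations were programmed in Excel:

```python
import pandas as pd
from scipy import stats

# Hypothetical per-post export downloaded from the platforms (CSV-style data).
df = pd.DataFrame({
    "platform": ["fb", "fb", "fb", "ig", "ig", "ig", "tw", "tw", "tw"],
    "impressions": [300, 420, 380, 900, 760, 810, 150, 200, 170],
})

# Descriptive statistics of the kind placed on the dashboards.
print(df["impressions"].agg(["mean", "median"]))
print(df["impressions"].mode())

# One-way ANOVA: do impressions differ across platforms?
groups = [g["impressions"].to_numpy() for _, g in df.groupby("platform")]
print(stats.f_oneway(*groups))
```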
After doing all of this for the law firm's own sites and content, another task was to create a comparison study with the "Top $30 Billion" firms. I extended this by also cross-comparing those top firms with the top firms on the social media platforms and those with the highest DA, and I emphasized the top $30 billion firms that had the highest DA and the most enthusiastic social media presences. It was the same type of study as the one completed for the law firm's sites, only without access to back-office statistics. The visuals in this study reference a few different types of outcomes. First, there was a calculation based on impressions and sentiment derived from a face-value review of the content. The outcomes were correlated with the number of followers on the different sites.
The coding scheme was: a number for the media site, a code for the content on that site, the number of impressions, and a sentiment code (1 for positive, 2 for negative). The segmentation was ordered by website type, then ranked by number of followers, and the bar graph displayed the sentiment. There were pie charts for each company tied into the bar graph labels, such as Company 1 on Twitter with a given number of followers; the pie segments were listed by content code, with one pie for positive sentiment, another for negative sentiment, another for number of comments, and another for number of likes.
I was also able to complete a small link analysis on these companies using the Ahrefs application. The test there was to find what correlations could be derived from past and current data about the links, the DA, and the optimal social media stats, and then to cross-compare that data with the law firm's own. The idea was to establish with greater certainty the correlation between an increased social media presence, the link analysis, and DA: to find out whether social media platforms could be geared to generate more links to and from the website, or whether the number of links was correlated with a high DA.
The database was analyzed the same way as in all the other studies. There was a line chart analysis to watch general change over time, with a link chart, a DA chart, a social media impressions chart, a social media sentiment chart, and a hits chart all on line graphs. The outcomes were then combined to show a correlation visualized together with a standard deviation slope. The findings showed that all had a positive influence, but most were not directly correlated. However, social media did show a slight correlation once posts were coded with an integer 1 for containing a link to outside media, an integer 2 for containing a link to the website, and an integer 0 for no link. Typically, links to the website and links to outside media had similar outcomes. However, there was no testing of generated links to materials other than reference materials, or materials the firm created as a reference for individuals hoping to evaluate their own case. The data showed that the created material greatly increased visitors to the site.
One beneficial finding was the way bots were being used on top-dollar websites. A negative sentiment found in the Google reviews study was that clients felt our customer service suffered. Adding a new bot to the website therefore became a major task for the social media team.
My job then evolved. First came a discovery exercise on the department budget, so I could estimate the benefits of joining different marketing organizations with any open funding, or calculate changes to the budget that would allow the team to join various possible marketing associations. The goal was for the team to win an award somewhere to earn that superior-marketing credential for the company. I used an Excel sheet to preview the budget, watch its flow from month to month, and estimate its long-term sustainability. The budget had some months where it ran into the red and some where it was well ahead; the explanation was that the budget was calculated by quarter rather than by month, so where expenses ran over one month, they were made up the next. The sheet had running calculations. I added the figures from the different media associations I found and tallied them against the different payment types to find what would fit the budget, for a proposal to the owner, as we discussed it back and forth on company email with the owner tagged into the conversation as required.
After that, the owner moved me to a new department he created and named me the Analytics Administrator.
The first task there was to go through the car insurance and tags chart in Excel to determine which insurance payments were due and which cars were due for tag renewal, then create a rule such that anything past due or due within the week would automatically highlight in red. This was done by first creating a reference cell that automatically updates to today's date, then calculating the absolute number of days from the due date. The rule was set to a range: below a certain number of days, the cell turned red; between that and an upper threshold, it was highlighted yellow; above that range, it was automatically highlighted green. A couple of dummy entries were created with due dates just a few days out, etc., to verify that the chart changed automatically.
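A sketch of the same three-band due-date logic in pandas; in the actual sheet this was TODAY() plus conditional-formatting rules, and the thresholds here are illustrative:

```python
import pandas as pd

# Hypothetical renewals sheet; in Excel this was a TODAY() reference cell plus
# conditional-formatting rules rather than Python.
due = pd.DataFrame({
    "vehicle": ["Car 1", "Car 2", "Car 3"],
    "due_date": pd.to_datetime(["2025-06-03", "2025-06-20", "2025-07-15"]),
})

today = pd.Timestamp.today().normalize()
due["days_left"] = (due["due_date"] - today).dt.days

# The same three-band rule as the sheet: red for past due or due soon,
# yellow for approaching, green otherwise (thresholds are illustrative).
def band(days):
    if days <= 7:
        return "red"
    if days <= 21:
        return "yellow"
    return "green"

due["status"] = due["days_left"].apply(band)
print(due)
```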
The next task was to take the data from all of the company's income, expenditures, and IOLTA holds to calculate the ROI and the monthly operational profit. For this task, I also had to calculate the expected earnings from settlements based on the settlement amount and interest, and obtain an IOLTA statement of expenses. The owner also wanted a dashboard showcasing each attorney's settlement amounts, as well as the number of cases settled before litigation and the number settled after. The compensation amount was a percentage of the settlement, with the percentage depending on whether the case settled before or after litigation began.
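A toy sketch of the operating-profit and ROI arithmetic; all amounts are made up, and IOLTA holds are excluded from income since they are client funds held in trust:

```python
# Toy monthly figures; every amount here is made up for illustration.
settlement_income = 250_000   # firm's percentage share of settlements
other_income = 15_000
expenses = 180_000            # payroll, case costs, overhead
iolta_holds = 40_000          # client funds held in trust: tracked, not income

operating_profit = settlement_income + other_income - expenses
roi = operating_profit / expenses
print(f"Operating profit: ${operating_profit:,}  ROI: {roi:.1%}")
```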
A case sheet was turned in for each case, registering the settlement amounts along with any expenses incurred during the case. For the settlements dashboard, I recorded only the actual settlement amount adjusted by the percentage the firm earned from it. I used the other information when calculating the ROI and operating budget in the financials dashboard.
The accounting manager then wanted to delve into past records to gain a rough understanding of the general ebb and flow of settlements, first to make predictions about future budgets and then to find any outliers in the budget.
Another project was the inbound-calls rating sheet. The Excel sheet logged the time a call started, the time it ended, what the call was for, how it was resolved, and any comments about the call, along with a brief transcript. I generated the same range codes as before, applied to call durations; the longest calls were reviewed first, with the transcript examined to determine why the call took as long as it did. VLOOKUPs were used to isolate intake calls from client calls. The intake calls had column entries on why the caller needed our firm, including disqualifiers and referrals. I reviewed these to find whether the customer service representatives were properly disqualifying customers and, when they did, whether they were sending referrals or simply dismissing callers from the phone. I then created an internal rating based on the data collected during the review.
Another interesting task with the customer service department was to take the ratings and examine their ebb and flow over time, both per employee and for the team overall. I also ran comparisons with customer survey data to see whether our ratings correlated with customer ratings, and correlations to see, first, whether survey ratings correlated with call timing and, second, whether our ratings coincided with call timing. The same two correlational studies were conducted with the disqualification and referral data: a 1 for referred and a 2 for not referred, a 1 for qualified and a 2 for not qualified. There were separate pivot tables for each, placed into a dashboard with explanations. Once the paralegals' data was introduced, as described next, it also became possible to correlate the timing from qualification to case opening, including attorney outreach, with customer satisfaction.
The next task was a review of employee billing logs, including descriptions of the work completed. This task was more complicated than some of the others because it required analyzing each employee's free-text descriptions, written in their individual interpretations of the tasks they did. Although the language of each differed slightly, those descriptions had to be labeled with the same codes, with the codes derived from the descriptions themselves. The billings were also used to determine the time it took to complete each task. The codes were translated into counts by adding a column next to the data and entering a 1 in each row; Excel then counted each 1 based on the code in the adjoining column. From this data, pivot charts were created to see how many times each task was completed, and a correlational line graph was created to determine how closely each type of task predicted time to completion. These were cross-examined for outliers to gauge employee efficiency, and a cross-examination between the number of tasks and the time-to-completion correlations helped determine whether any specific task should become its own department or remain in the paralegals' queue.
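A minimal sketch of that coding-and-counting step, with a hypothetical code map playing the role of the Excel code column and COUNTIF tallies:

```python
import pandas as pd

# Hypothetical billing log: free-text descriptions mapped to shared task codes.
log = pd.DataFrame({
    "description": ["drafted demand letter", "demand ltr draft",
                    "client intake call", "intake phone screen",
                    "drafted demand letter"],
    "hours": [2.0, 1.5, 0.5, 0.7, 1.8],
})

# Each wording variant gets the same code; in Excel this was a code column
# plus a column of 1s counted with COUNTIF.
code_map = {"drafted demand letter": "DEMAND", "demand ltr draft": "DEMAND",
            "client intake call": "INTAKE", "intake phone screen": "INTAKE"}
log["task_code"] = log["description"].map(code_map)

# Count of each task and average time to completion.
print(log.groupby("task_code")["hours"].agg(["count", "mean"]))
```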
The next task was to interview all of the employees and take notes on what their tasks were, then create a Pareto-style coding system where each employee department would report on two tasks via email at the end of the day. The tasks were quantifiable: there were time reports, number reports, or yes/no answers translated into a 1 or 2, all entered into an Excel sheet. From there, a dashboard was created for each department to measure the efficiency of task completion: over time, what amount was feasible to complete in a day, what was the time pattern for completion, and how often were employees able to hit segmented goals? Another task was to examine a histogram line chart to find whether any employees were positive or negative outliers. This allowed further research into those statistics while building a clear understanding of what was feasible across the general population of hired employees in each department.
The owner of the law firm was interested in winning an award for something. He requested a preliminary study of what the top firms in the $30 billion study were winning awards for. This entailed a search of their websites and social media sites and a firm search in awards directories. There was some simple coding with numbered tallies, including a bar chart referenced to a pie chart. This study found that the most common award for these firms was as a superior employer, with a top category of employee engagement. From there, I searched multiple awards sites for the criteria required to win recognition as an outstanding employer. The recommendation sent to the owner was that he could use the paid option, which also included a paywall, in order to begin qualifying for the nominated option. A reference to the criteria for the paid site was sent as well, with a goal of achieving the award within six months. Then a criteria coding study was begun for the nominated site, which had shown a correlation to World News Reports' top-law-firms criteria in a private study of firms that had won on both sites, with a results dashboard also going to the owner.
The chosen nominated award platform was then coded according to the actions that produced winners over the past four years. The codes were scraped from the winners' descriptions on the website. The winning companies were then researched through their company websites and social media to find what activities, engagements, advertisements, blogs, or company policies coincided with the listed reason they won. From this, generalized codes were derived and tallied into an Excel sheet to discover which efforts were most often the cause of winning. This allowed the development of a list of internal criteria, with action steps, giving the firm a plan on a two-year timeframe to win the nominated, free award.
Another important Excel task was the tracking sheet for network connections. There was a general workbook for past and present connections, and another to track compliance with the networking agreement. Here, attorneys and doctors who received referrals from the firm, or agreed to send referrals to it, were charted. Each area where a firm office was located had a separate workbook, with a separate worksheet for each month of the year. Each referral from the firm to one of the networked entities was marked with a date, with columns for first, second, and third referrals; these entities also had columns for referring to the firm. Each row was completed with an adjoining tally column of 1s. The tallies were summed downward to see which months reached their goal and which were lagging. The totals were linked into a pivot chart to visualize all the months together in a comparison of sums. A row was also inserted after the tallies on the third date, for both outbound and inbound referrals, with a color alarm that turned green once a goal was met. These tallies were also charted in a combined monthly dashboard to find patterns among partners over time.
Finally, the owner wanted a dashboard reflecting attorney compliance with bonus incentives. For this, each attorney was listed as a row and each column was labeled with a required incentive. Where a particular activity had to be performed multiple times to meet the incentive, there was a separate column for each instance. These columns were filled in with the date the incentive was met, with a second column beside each holding a 1 tally. This was linked to a dashboard showing the sum across for each attorney as well as the completion total for each incentive. The owner could quickly see which attorneys were completing each month and which incentive activities were completed the most. These figures were also used in an employee dashboard that included a pie chart of litigation versus pre-litigation settlements for each attorney, a pie chart of the percentage of each settlement range for each attorney, and a display of a linked calculation of overall monthly profits for each attorney.
Dates of employment: October 2019 - March 2020
Employer Name: Speech and Presentation Leadership Development Club
Your title: Vice President of Education
A thorough description of responsibilities:
As the Vice President of Education of a Toastmasters club, it was my task to make sure that members of the group achieved their educational goals, and that they set those goals in the first place. I assigned mentors or coaches to those who needed them and set up contests as required for members to advance to higher-level versions of those contests. A set number of goals was required to achieve certain designations awarded to clubs that mark them as superior to other clubs: Distinguished, Select Distinguished, and President's Distinguished, and some clubs also earn a Smedley Award. I had to track achievements and measure them against these requirements, then come up with a plan for reaching the goals the club had set for its target level.
Leadership team members of individual clubs have access to an online chart of all members' pathways: where they are in their levels, which projects they have completed, how long they have been on a project, and any achievements they have earned. This detailed report can be downloaded into an Excel sheet. I downloaded this report to display current and past achievements, with timing, levels, and comparisons. I used VLOOKUPs on the Excel sheet to find which members were at levels 1, 2, 3, 4, or 5, and which members had completed at least one pathway. I could check the timing of a specific project to determine whether I needed to speak with a member and set up a plan to help them reach a project goal that might otherwise be unachievable. If I saw a group of members stuck at one level, I could host a special meeting just for that level for speech completion, with club award incentives. These are some examples of how Excel was used for an organization like this.
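A small pandas sketch of those VLOOKUP-style slices, assuming a hypothetical Pathways export; names and figures are invented:

```python
import pandas as pd

# Hypothetical Pathways export: one row per member (names invented).
members = pd.DataFrame({
    "member": ["Ann", "Ben", "Cara", "Dev"],
    "level": [2, 5, 2, 4],
    "pathways_completed": [0, 2, 0, 1],
    "months_on_current_project": [1, 3, 7, 2],
})

# VLOOKUP-style slices: who is at a given level, who has finished a pathway,
# and who may be stuck on a project and needs a check-in.
print(members[members["level"] == 2])
print(members[members["pathways_completed"] >= 1])
print(members[members["months_on_current_project"] > 6])
```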
For an informational study, I also did a graphical review of pathway and level completion. I ran a VLOOKUP, attached a pivot chart to the VLOOKUP results, then completed COUNTIF tallies to create a bar graph comparing members' completion levels across different years of membership, and comparing the number of full pathways completed to the current level completed. I also did a cross-study of years of membership against number of pathways completed.
I also used Excel for simple tracking of judges when I served as chief judge for contests. For that task, I charted contact information, when certain forms were turned in, when certain training was completed, who RSVP'd for planning events, and who arrived at which meeting. I also used it to track club goals and milestones. Once I spoke with a member and they mentioned their goals, we made a plan to break those goals down and placed them on an Excel sheet, where the club's goals were listed together. This way I could group shared goals for the purpose of creating events where those goals were showcased as a theme and incentivized for completion.
Dates of employment: February 2021 - October 2022
Employer Name: Staffing Services
Your title: Regulatory Compliance Specialist
A thorough description of responsibilities:
This position required the ability to sort and reconcile data. It was a small temporary project for Kelly Services: an employee was leaving, and a lot of materials still needed updates to their UBI labeling after ISO 13485 was updated to require universal over American labeling IDs. The task was to sort through the Excel sheet using color codes and VLOOKUPs to update the UBIs that had not been updated there but had been updated on the registration site, and to match specific products to their original manufacturers so that a list of each manufacturer's products was generated with the new UBI. Certain codes and original-manufacturer product IDs tied each product to one manufacturer over another. Products with UBIs were listed by whether they were registered or still needed registration; those without were sorted the same way, plus whether they still needed a UBI associated with them. They were also sub-sorted, with another color coding, into approved or disapproved labels, since the labels were printed for actual products. Once color-coded, they were placed on separate worksheets reflecting their manufacturer of origin.
Dates of employment: June 2019 - July 2019
Employer Name: University
Your title: Quantitative Research Analyst - Student
A thorough description of responsibilities:
Create a survey whose results can be transformed into a quantitative analysis. Calculate the required number of responses to achieve statistical power of at least 93% under randomization. Conduct the survey. Download survey questions and responses into an Excel sheet and clean the data. Conduct ANOVA calculations on columns to be combined, because the questions represented the same leadership style or trust type, to make sure the ANOVA showed a lack of variance between them. Check for outliers using a line graph with point markers turned on in a separate Excel worksheet, to confirm the data was smooth with a similar pattern. Combine columns through median calculations of the response ratings and place the combined values in a column beside each group. Transfer the data into SPSS for further MANOVA calculation to test for moderating effects.
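A hedged sketch of the sample-size step using statsmodels; the effect size, alpha, and group count are assumptions, not the study's actual parameters:

```python
from statsmodels.stats.power import FTestAnovaPower

# Solve for the total sample size that yields 93% power for a one-way ANOVA;
# effect_size, alpha, and k_groups here are illustrative assumptions.
n = FTestAnovaPower().solve_power(effect_size=0.25, alpha=0.05,
                                  power=0.93, k_groups=3)
print(f"Required total responses: {n:.0f}")
```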
Dates of employment: August 2018 - May 2019
Employer Name: University
Your title: Research Analyst - Student
A thorough description of responsibilities:
Generate Excel worksheets depicting cost analyses from the income statements and balance sheets of multiple publicly traded companies; determine solvency and cross-compare for future M&A prospects. Create a decision tree analysis of low, moderate, and high demand, and calculate production and maintenance costs for the current year and the total five years out. Build a predictive sheet of return on income for the last three years, with predictions for future years using EBITDA and CAPEX calculations and a general operating revenue derived from past overall revenue. Generate SEC-style reports from the data discovered during the financial statement calculations, and derive data for future predictions from past publicly filed SEC reports. Also build environmental toxicology arrays in Excel with Crystal Reports to generate a PDF of calculations on the build-up of contaminants based on the levels of multiple sources of different pollutants; these were completed with pivot tables to track changes in toxicity based on fluctuations of the chemicals tested and to reflect randomized sampling changes.
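A toy sketch of the decision-tree expected-value calculation; the probabilities, revenues, and costs are illustrative stand-ins for the coursework figures:

```python
# Toy decision-tree expected value; probabilities, revenues, and costs are
# illustrative stand-ins, not the actual coursework figures.
scenarios = {
    "low":      {"p": 0.25, "revenue": 2_000_000},
    "moderate": {"p": 0.50, "revenue": 5_000_000},
    "high":     {"p": 0.25, "revenue": 9_000_000},
}
production_cost = 1_500_000    # current-year production cost
maintenance_cost = 400_000     # per-year maintenance cost

# Expected value for the current year, and a simple five-year total
# (assuming the same probabilities and costs recur each year).
ev_year = sum(s["p"] * s["revenue"] for s in scenarios.values()) \
          - production_cost - maintenance_cost
ev_5yr = 5 * ev_year
print(f"Expected value, year 1: ${ev_year:,.0f}; 5-year total: ${ev_5yr:,.0f}")
```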
Simple tracking of product qualities, international policies, and statistics was kept in Excel. Further, a log of annotated bibliography references and notes was kept in Excel.
Dates of employment: August 2010 - May 2014
Employer Name: Staffing Services
Your title: Media Analyst
A thorough description of responsibilities:
Code content on social media sites and place the codes in Excel, then use a number tally of each coded reference. Include separate columns for number of likes, number of comments, and positive and negative sentiment. Create a pivot table showing a bar graph of the uses of the different coded references. Cross-compare impression data using correlation analysis. Create a pivot pie chart of the sum of impressions split by coded reference to compare what gets attention and what doesn't, and a pivot pie graph of the sum of sentiment categories by coded reference to determine what gets positive versus negative feedback.
Use Excel to chart content text in one column, the content picture link in another, the date of the post in another, and the category in another. Upload to a social media manager for automatic scheduling, and cross-compare the automatic postings to ensure the upload was accurate. Pull statistics from the HubSpot media manager to review impressions, web hits, click-throughs, and sentiment. Download to Excel, check the data on a line graph with dot markers enabled, and smooth the data. Create a visual dashboard using pivot tables, and upload to Power BI for a simpler visual reference and a cross-comparison over time.
Dates of employment: September 2018 - March 2019
Employer Name: IT Support Services Organizations
Your title: Customer Dispute Specialist (Remote)
A thorough description of responsibilities:
This was mostly qualitative data analysis with some quantitative. The position required the use of spark notes: each employee had to code their tasks in a format that could easily be extracted and analyzed for efficiency. The reports were examined for patterns related to time efficiency, survey scores, and end resolution. Employees also had to mark whether the issue was resolved. There was a timer on each call, and survey results were marked on a 1-10 effectiveness scale. Results were provided to each employee at the end of the period in a Google Sheet easily downloaded to Excel. I took the results sheets, encoded with employee numbers to remove identifiers, and compared my results with my peers', comparing timing, resolution, and survey results to find patterns of correlation. This helped me increase my overall KPI scores more effectively during evaluations.
Furthermore, I was able to take my text spark notes and find patterns to compare with my Excel ratings findings. This helped me learn what consumers typically needed when they called, what patterns of resolution worked best with the quickest turnaround time, and what to avoid. I used this data to coach newcomers to reach my scores, and requested their information from the ratings sheets to build a plan of action for coaching them into higher roles on the team.
Another benefit was taking the textual analysis reports and creating new tags for easier searching in the knowledge base. The easier the search results are to find, the quicker the process for resolving customer issues, and the better my end score or my teammates' end scores. The data was also helpful in suggesting new knowledge base additions, with explanations and/or process architectures included.
Dates of employment: March 2016 - July 2018
Employer Name: Engineering Firm
Your title: Environmental Compliance Specialist (Scientist 1)
A thorough description of responsibilities:
Generating detailed permitting plan schedules for conversion into Access via macros, feeding a variety of reports: person to task, tentative year of the task, month of the task, department of the task, and description of the task. Creating front-end to back-end real-time task and investigation reports for use in future permitting reports: a SharePoint-generated website served as the front-end input form, and on the back end the data flowed into an Excel sheet with color coding for input, processing, and completed. Excel was also used to chart data from water bodies on pollutants, water pressure, and depth; the data was placed in a pivot table as a line graph and linked to a PowerPoint for visual display in presentations.
Excel was also used for cost estimates to develop a feasible mitigation plan for a Title V air permit: a range indicator was placed on the sheet, and products were included with their costs calculated over time alongside the predicted mitigation. The indicator showed green within the modal area of the range, yellow at the outer limits, and red outside the range; the range covered discharge amounts and costs over time. Excel was also used to chart utility cost calculations, where current average costs by location, including expenditures, were listed and proposed increases were tested to find an acceptable itemized increase within allowable ranges, using similar indicators. The estimates were likewise placed in line graphs to depict the supply-and-demand shift over time, for the purpose of determining incentives for decreased usage toward the offered alternative.
Dates of employment: June 2007 - April 2008
Employer Name: Real Estate Time Share
Your title: Timeshare Representative
A thorough description of responsibilities:
Tracking a log of sales leads to call when not on the floor, including current ownership status, cost of the real estate, and estimated income. Logging data verification, the date of in-person contact or a set appointment, and the result of both in-person and phone contact. Tracking a cost estimate sheet for each timeshare presented, including a maximum sell price; first-level, second-level, and third-level deals; and any combination deals. Tracking sale or no sale, and of what type.
Dates of employment: May 2014 - August 2014
Employer Name: Charity Fundraising
Your title: Event Co-Chair and Advocacy Chair
A thorough description of responsibilities:
For Event Co-Chair: tracking teams that signed up for the event via an Excel download from the site; tracking member numbers, fundraising goals, and current amounts collected; and keeping a running tally. Creating a pivot table of teams to show a bar chart of goals met per member and a pie chart of the percentage of team goals met, plus a pivot table of all team funds raised, as a bar chart for the highest team stats and a pie chart of the percentage of this year's predetermined goal reached. Also used Excel to find networking contacts and chart contact results: answered, left message, busy signal, disconnected, do not call back, wrong number, no answer. Tracked donations and translated item amounts into expenditures for the purpose of tracking the highest donations. Tracked monetary donations in the goal amount chart and pie chart unless they were used for expenditures.
Tracked required votes in Excel, with a signal that turned green once the required amount was reached. Charted donations from votes in another row and created a sum total of donations received, plus a bar graph of donations to votes to determine how much was donated per vote overall.
Dates of employment: May 2006 - November 2006 & May 2018 - November 2018
I possess a sharp eye for detail, which I use to find even the smallest errors in text. I work well under pressure and can produce high-quality work in short periods of time. I have strong interpersonal skills and work with a wide variety of people.
I am looking for an opportunity to work with a team that runs on transparent, clear communication. I want to align myself with a company I believe in and where I can create positive change. I am always looking to learn more and am open to taking on challenging projects.
Bonnie Aylor