Frontend Engineer | Experience: 3+ Years
Development | Noida | India | Salary Negotiable | Published Date: 11th July 2024
With offices in Texas, London, and New Delhi, Lepide is a global IT security organization leading the data-centric audit and protection (DCAP) market with the award-winning Lepide Data Security Platform. Our mission is to change the way organizations protect their unstructured data: by putting data at the centre of their IT security strategy, they can tackle data security at the source of the problem. We help organizations by providing enterprise-level insight into data and the surrounding systems, whether on-premises or in the cloud. Statistically:
- We have over 1,000 customers across 150 countries.
- We provide outstanding customer support, with 99% customer satisfaction.
- We are the fastest-growing DCAP provider in the market today.
Position Overview
We are seeking skilled Software Development Engineers to join our dynamic team building an AI-backed analytics, alerting, and monitoring stack for a product aimed at strengthening cybersecurity and compliance in on-premises and cloud environments. This role will play a key part in designing and implementing algorithms that analyse incoming logs from various sources to identify potential risks to users’ intellectual property and documents. The result is a B2B product suite addressing security and compliance standards such as HIPAA, GDPR, and OWASP.
Key Responsibilities
- Design, develop, and implement platform-agnostic products for Lepide.
- Develop scalable and efficient software modules that add new capabilities to existing Lepide products and new initiatives.
- Optimize the user experience for performance, accuracy, and maintainability in local and remote environments alike.
- Work collaboratively with cross-functional teams, including data engineers and scientists, other software developers, business analysts, and product managers, to ensure a seamless experience for the end user.
- Build, operate, and maintain the product line on the team’s charter end-to-end, following the established PDLC.
Required Skills and Qualifications
- 3+ years of experience in frontend development.
- Proficiency in JavaScript/TypeScript and modern libraries (e.g., React) and/or frameworks (e.g., Vue.js).
- Strong data structures and algorithms (DSA) skills and knowledge of design patterns.
- Bachelor’s degree in Computer Science, Software Engineering, or a related field.
- Solid problem-solving skills and ability to work independently as well as part of a team.
- Effective communication skills with the ability to convey complex technical information clearly and concisely.
- Experience developing RESTful architectures and APIs, including OpenAPI specifications.
- Experience with build tools such as Webpack, Babel, Ninja, CMake, NMake, or Make.
- Experience with web testing frameworks (e.g., Jest, JMeter, Mocha, Selenium).
- Experience with version control systems (e.g., Git).
- Experience with a CI/CD pipeline on GitHub, GitLab, or Jenkins.
- Proficiency with browser-based developer tools.
- Strong understanding of web development best practices.
- Experience with Agile methodologies.
Nice to Have
- Familiarity with productivity tools such as JIRA, Confluence, Markdown, and UML.
- Familiarity with cross-platform mobile app development using React Native.
- Familiarity with iOS and Android.
- Familiarity with on-premises infrastructure and platform-agnostic deployment methodologies such as Docker and Kubernetes.
- Familiarity with microservices.
- Familiarity with the PDLC and SDLC.
- Familiarity with storage technologies such as RDBMS, NoSQL, and in-memory stores.
- Ability to think like a user of our products in order to enhance their UX.
- Knowledge of cybersecurity concepts and best practices applicable to on-premises deployments.
- Contributions to open-source projects or publications in relevant conferences/journals.
- Strong understanding of MSSQL, PostgreSQL, MySQL, and other database management systems, including log analysis techniques in on-premises environments.
- Experience with AI frameworks such as TensorFlow, PyTorch, or similar.
- Familiarity with data engineering tools such as Apache Kafka, Apache Spark, or Apache Hadoop for managing and processing large datasets.